00:00:00.001 Started by upstream project "autotest-nightly-lts" build number 2029
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3289
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.001 Started by timer
00:00:00.030 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.031 The recommended git tool is: git
00:00:00.031 using credential 00000000-0000-0000-0000-000000000002
00:00:00.032 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.044 Fetching changes from the remote Git repository
00:00:00.058 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.075 Using shallow fetch with depth 1
00:00:00.075 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.075 > git --version # timeout=10
00:00:00.090 > git --version # 'git version 2.39.2'
00:00:00.090 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.103 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.103 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:03.145 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.155 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.165 Checking out Revision 6b67f5fa1cb27c9c410cb5dac6df31d28ba79422 (FETCH_HEAD)
00:00:03.165 > git config core.sparsecheckout # timeout=10
00:00:03.175 > git read-tree -mu HEAD # timeout=10
00:00:03.201 > git checkout -f 6b67f5fa1cb27c9c410cb5dac6df31d28ba79422 # timeout=5
00:00:03.223 Commit message: "doc: add chapter about running CI Vagrant images on dev-systems"
00:00:03.234 > git rev-list --no-walk 6b67f5fa1cb27c9c410cb5dac6df31d28ba79422 # timeout=10
00:00:03.325 [Pipeline] Start of Pipeline
00:00:03.339 [Pipeline] library
00:00:03.341 Loading library shm_lib@master
00:00:03.341 Library shm_lib@master is cached. Copying from home.
00:00:03.360 [Pipeline] node
00:00:03.377 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:03.379 [Pipeline] {
00:00:03.387 [Pipeline] catchError
00:00:03.389 [Pipeline] {
00:00:03.401 [Pipeline] wrap
00:00:03.411 [Pipeline] {
00:00:03.421 [Pipeline] stage
00:00:03.423 [Pipeline] { (Prologue)
00:00:03.690 [Pipeline] sh
00:00:03.973 + logger -p user.info -t JENKINS-CI
00:00:03.988 [Pipeline] echo
00:00:03.989 Node: WFP20
00:00:03.997 [Pipeline] sh
00:00:04.291 [Pipeline] setCustomBuildProperty
00:00:04.299 [Pipeline] echo
00:00:04.300 Cleanup processes
00:00:04.303 [Pipeline] sh
00:00:04.583 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.583 3021980 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.595 [Pipeline] sh
00:00:04.876 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.876 ++ grep -v 'sudo pgrep'
00:00:04.876 ++ awk '{print $1}'
00:00:04.876 + sudo kill -9
00:00:04.876 + true
00:00:04.890 [Pipeline] cleanWs
00:00:04.899 [WS-CLEANUP] Deleting project workspace...
00:00:04.899 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.911 [WS-CLEANUP] done
00:00:04.915 [Pipeline] setCustomBuildProperty
00:00:04.931 [Pipeline] sh
00:00:05.218 + sudo git config --global --replace-all safe.directory '*'
00:00:05.289 [Pipeline] httpRequest
00:00:05.323 [Pipeline] echo
00:00:05.324 Sorcerer 10.211.164.101 is alive
00:00:05.331 [Pipeline] httpRequest
00:00:05.334 HttpMethod: GET
00:00:05.335 URL: http://10.211.164.101/packages/jbp_6b67f5fa1cb27c9c410cb5dac6df31d28ba79422.tar.gz
00:00:05.335 Sending request to url: http://10.211.164.101/packages/jbp_6b67f5fa1cb27c9c410cb5dac6df31d28ba79422.tar.gz
00:00:05.338 Response Code: HTTP/1.1 200 OK
00:00:05.338 Success: Status code 200 is in the accepted range: 200,404
00:00:05.338 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_6b67f5fa1cb27c9c410cb5dac6df31d28ba79422.tar.gz
00:00:06.038 [Pipeline] sh
00:00:06.321 + tar --no-same-owner -xf jbp_6b67f5fa1cb27c9c410cb5dac6df31d28ba79422.tar.gz
00:00:06.334 [Pipeline] httpRequest
00:00:06.375 [Pipeline] echo
00:00:06.376 Sorcerer 10.211.164.101 is alive
00:00:06.382 [Pipeline] httpRequest
00:00:06.385 HttpMethod: GET
00:00:06.386 URL: http://10.211.164.101/packages/spdk_dbef7efacb6f3438cd0fe1344a67946669fb1419.tar.gz
00:00:06.386 Sending request to url: http://10.211.164.101/packages/spdk_dbef7efacb6f3438cd0fe1344a67946669fb1419.tar.gz
00:00:06.404 Response Code: HTTP/1.1 200 OK
00:00:06.405 Success: Status code 200 is in the accepted range: 200,404
00:00:06.405 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_dbef7efacb6f3438cd0fe1344a67946669fb1419.tar.gz
00:01:21.358 [Pipeline] sh
00:01:21.661 + tar --no-same-owner -xf spdk_dbef7efacb6f3438cd0fe1344a67946669fb1419.tar.gz
00:01:24.967 [Pipeline] sh
00:01:25.254 + git -C spdk log --oneline -n5
00:01:25.254 dbef7efac test: fix dpdk builds on ubuntu24
00:01:25.254 4b94202c6 lib/event: Bug fix for framework_set_scheduler
00:01:25.254 507e9ba07 nvme: add lock_depth for ctrlr_lock
00:01:25.254 62fda7b5f nvme: check pthread_mutex_destroy() return value
00:01:25.254 e03c164a1 nvme: add nvme_ctrlr_lock
00:01:25.266 [Pipeline] }
00:01:25.284 [Pipeline] // stage
00:01:25.294 [Pipeline] stage
00:01:25.297 [Pipeline] { (Prepare)
00:01:25.316 [Pipeline] writeFile
00:01:25.333 [Pipeline] sh
00:01:25.619 + logger -p user.info -t JENKINS-CI
00:01:25.631 [Pipeline] sh
00:01:25.917 + logger -p user.info -t JENKINS-CI
00:01:25.930 [Pipeline] sh
00:01:26.214 + cat autorun-spdk.conf
00:01:26.214 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:26.214 SPDK_TEST_FUZZER_SHORT=1
00:01:26.214 SPDK_TEST_FUZZER=1
00:01:26.214 SPDK_RUN_UBSAN=1
00:01:26.222 RUN_NIGHTLY=1
00:01:26.226 [Pipeline] readFile
00:01:26.254 [Pipeline] withEnv
00:01:26.256 [Pipeline] {
00:01:26.272 [Pipeline] sh
00:01:26.557 + set -ex
00:01:26.557 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:26.557 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:26.557 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:26.557 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:26.557 ++ SPDK_TEST_FUZZER=1
00:01:26.557 ++ SPDK_RUN_UBSAN=1
00:01:26.557 ++ RUN_NIGHTLY=1
00:01:26.557 + case $SPDK_TEST_NVMF_NICS in
00:01:26.557 + DRIVERS=
00:01:26.557 + [[ -n '' ]]
00:01:26.557 + exit 0
00:01:26.566 [Pipeline] }
00:01:26.585 [Pipeline] // withEnv
00:01:26.590 [Pipeline] }
00:01:26.607 [Pipeline] // stage
00:01:26.616 [Pipeline] catchError
00:01:26.618 [Pipeline] {
00:01:26.633 [Pipeline] timeout
00:01:26.633 Timeout set to expire in 30 min
00:01:26.635 [Pipeline] {
00:01:26.650 [Pipeline] stage
00:01:26.652 [Pipeline] { (Tests)
00:01:26.666 [Pipeline] sh
00:01:26.950 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:26.950 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:26.950 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:26.950 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:26.950 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:26.950 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:26.950 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:26.950 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:26.950 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:26.950 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:26.950 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:26.950 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:26.950 + source /etc/os-release
00:01:26.950 ++ NAME='Fedora Linux'
00:01:26.950 ++ VERSION='38 (Cloud Edition)'
00:01:26.950 ++ ID=fedora
00:01:26.950 ++ VERSION_ID=38
00:01:26.950 ++ VERSION_CODENAME=
00:01:26.950 ++ PLATFORM_ID=platform:f38
00:01:26.950 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:26.950 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:26.950 ++ LOGO=fedora-logo-icon
00:01:26.950 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:26.950 ++ HOME_URL=https://fedoraproject.org/
00:01:26.950 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:26.950 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:26.950 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:26.950 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:26.950 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:26.950 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:26.950 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:26.950 ++ SUPPORT_END=2024-05-14
00:01:26.950 ++ VARIANT='Cloud Edition'
00:01:26.950 ++ VARIANT_ID=cloud
00:01:26.950 + uname -a
00:01:26.950 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:26.950 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:29.488 Hugepages
00:01:29.488 node hugesize free / total
00:01:29.488 node0 1048576kB 0 / 0
00:01:29.488 node0 2048kB 0 / 0
00:01:29.488 node1 1048576kB 0 / 0
00:01:29.488 node1 2048kB 0 / 0
00:01:29.488
00:01:29.488 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:29.488 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:29.488 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:29.488 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:29.488 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:29.488 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:29.488 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:29.488 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:29.488 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:29.488 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:29.488 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:29.488 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:29.488 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:29.488 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:29.488 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:29.488 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:29.488 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:29.488 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:29.488 + rm -f /tmp/spdk-ld-path
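
The Prepare stage above sources autorun-spdk.conf and exits immediately because SPDK_TEST_NVMF_NICS is unset, so no NIC drivers are loaded for this short-fuzzer job. A minimal sketch of that pre-test step, reconstructed from the xtrace output (the real Jenkins-side script is not included in this log, and the NIC-to-driver mapping below is illustrative, not taken from the trace):

    #!/usr/bin/env bash
    # Hypothetical reconstruction of the driver-selection step traced above.
    set -ex

    conf=/var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
    [[ -f $conf ]] && source "$conf"

    # Choose kernel modules only when the job requests NVMf NIC testing;
    # the case branches here are assumptions for illustration.
    case $SPDK_TEST_NVMF_NICS in
      mlx5) DRIVERS="mlx5_core mlx5_ib" ;;
      e810) DRIVERS="ice irdma" ;;
      *)    DRIVERS= ;;
    esac

    # SPDK_TEST_FUZZER_SHORT jobs set no NICs, so modprobe is skipped and
    # the script exits 0, exactly as the "+ exit 0" line in the trace shows.
    [[ -n $DRIVERS ]] && sudo modprobe -a $DRIVERS
    exit 0
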
00:01:29.488 + source autorun-spdk.conf
00:01:29.488 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:29.488 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:29.488 ++ SPDK_TEST_FUZZER=1
00:01:29.488 ++ SPDK_RUN_UBSAN=1
00:01:29.488 ++ RUN_NIGHTLY=1
00:01:29.488 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:29.488 + [[ -n '' ]]
00:01:29.488 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:29.488 + for M in /var/spdk/build-*-manifest.txt
00:01:29.488 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:29.488 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:29.488 + for M in /var/spdk/build-*-manifest.txt
00:01:29.488 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:29.488 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:29.488 ++ uname
00:01:29.488 + [[ Linux == \L\i\n\u\x ]]
00:01:29.488 + sudo dmesg -T
00:01:29.488 + sudo dmesg --clear
00:01:29.488 + dmesg_pid=3023437
00:01:29.488 + [[ Fedora Linux == FreeBSD ]]
00:01:29.488 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:29.488 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:29.488 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:29.488 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2
00:01:29.488 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2
00:01:29.488 + [[ -x /usr/src/fio-static/fio ]]
00:01:29.488 + export FIO_BIN=/usr/src/fio-static/fio
00:01:29.488 + FIO_BIN=/usr/src/fio-static/fio
00:01:29.488 + sudo dmesg -Tw
00:01:29.488 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:29.488 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:29.488 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:29.488 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:29.488 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:29.488 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:29.488 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:29.488 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:29.488 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:29.488 Test configuration:
00:01:29.488 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:29.488 SPDK_TEST_FUZZER_SHORT=1
00:01:29.488 SPDK_TEST_FUZZER=1
00:01:29.488 SPDK_RUN_UBSAN=1
00:01:29.748 RUN_NIGHTLY=1
00:01:29.748 05:01:00 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:01:29.748 05:01:00 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:01:29.748 05:01:00 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:01:29.748 05:01:00 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:01:29.748 05:01:00 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:29.748 05:01:00 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:29.748 05:01:00 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:29.748 05:01:00 -- paths/export.sh@5 -- $ export PATH
00:01:29.748 05:01:00 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:29.748 05:01:00 -- common/autobuild_common.sh@437 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:01:29.748 05:01:00 -- common/autobuild_common.sh@438 -- $ date +%s
00:01:29.748 05:01:00 -- common/autobuild_common.sh@438 -- $ mktemp -dt spdk_1721703660.XXXXXX
00:01:29.748 05:01:00 -- common/autobuild_common.sh@438 -- $ SPDK_WORKSPACE=/tmp/spdk_1721703660.McQqmJ
00:01:29.748 05:01:00 -- common/autobuild_common.sh@440 -- $ [[ -n '' ]]
00:01:29.748 05:01:00 -- common/autobuild_common.sh@444 -- $ '[' -n '' ']'
00:01:29.748 05:01:00 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:01:29.748 05:01:00 -- common/autobuild_common.sh@451 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:29.748 05:01:00 -- common/autobuild_common.sh@453 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:29.748 05:01:00 -- common/autobuild_common.sh@454 -- $ get_config_params
00:01:29.748 05:01:00 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:01:29.748 05:01:00 -- common/autotest_common.sh@10 -- $ set +x
00:01:29.748 05:01:00 -- common/autobuild_common.sh@454 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:01:29.748 05:01:00 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:29.748 05:01:00 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:29.748 05:01:00 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:29.748 05:01:00 -- spdk/autobuild.sh@16 -- $ date -u
00:01:29.748 Tue Jul 23 03:01:00 AM UTC 2024
00:01:29.748 05:01:00 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:29.748 LTS-60-gdbef7efac
00:01:29.748 05:01:00 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:29.748 05:01:00 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:29.748 05:01:00 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:29.748 05:01:00 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:01:29.748 05:01:00 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:29.748 05:01:00 -- common/autotest_common.sh@10 -- $ set +x
00:01:29.748 ************************************
00:01:29.748 START TEST ubsan
00:01:29.748 ************************************
00:01:29.748 05:01:00 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan'
00:01:29.748 using ubsan
00:01:29.748
00:01:29.748 real 0m0.000s
00:01:29.748 user 0m0.000s
00:01:29.748 sys 0m0.000s
00:01:29.748 05:01:00 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:29.748 05:01:00 -- common/autotest_common.sh@10 -- $ set +x
00:01:29.748 ************************************
00:01:29.748 END TEST ubsan
00:01:29.748 ************************************
00:01:29.748 05:01:00 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:01:29.748 05:01:00 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:29.748 05:01:00 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:29.748 05:01:00 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:01:29.748 05:01:00 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:01:29.748 05:01:00 -- common/autobuild_common.sh@426 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:01:29.748 05:01:00 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
00:01:29.748 05:01:00 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:29.748 05:01:00 -- common/autotest_common.sh@10 -- $ set +x
00:01:29.748 ************************************
00:01:29.749 START TEST autobuild_llvm_precompile
00:01:29.749 ************************************
00:01:29.749 05:01:00 -- common/autotest_common.sh@1104 -- $ _llvm_precompile
00:01:29.749 05:01:00 -- common/autobuild_common.sh@32 -- $ clang --version
00:01:29.749 05:01:00 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:01:29.749 Target: x86_64-redhat-linux-gnu
00:01:29.749 Thread model: posix
00:01:29.749 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:01:29.749 05:01:00 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:01:29.749 05:01:00 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:01:29.749 05:01:00 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:01:29.749 05:01:00 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:01:29.749 05:01:00 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:01:29.749 05:01:00 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a)
00:01:29.749 05:01:00 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:29.749 05:01:00 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:01:29.749 05:01:00 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:01:29.749 05:01:00 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:30.007 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:30.007 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:30.584 Using 'verbs' RDMA provider
00:01:46.412 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:01.302 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:01.302 Creating mk/config.mk...done.
00:02:01.302 Creating mk/cc.flags.mk...done.
00:02:01.302 Type 'make' to build.
00:02:01.302
00:02:01.302 real 0m31.410s
00:02:01.302 user 0m14.043s
00:02:01.302 sys 0m16.766s
00:02:01.303 05:01:32 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:02:01.303 05:01:32 -- common/autotest_common.sh@10 -- $ set +x
00:02:01.303 ************************************
00:02:01.303 END TEST autobuild_llvm_precompile
00:02:01.303 ************************************
00:02:01.303 05:01:32 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:01.303 05:01:32 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:01.303 05:01:32 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:01.303 05:01:32 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:02:01.303 05:01:32 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:01.562 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:01.562 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:02:02.131 Using 'verbs' RDMA provider
00:02:17.658 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:29.874 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:29.874 Creating mk/config.mk...done.
00:02:29.874 Creating mk/cc.flags.mk...done.
00:02:29.874 Type 'make' to build.
00:02:29.874 05:02:00 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:02:29.874 05:02:00 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:02:29.874 05:02:00 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:02:29.874 05:02:00 -- common/autotest_common.sh@10 -- $ set +x
00:02:29.874 ************************************
00:02:29.874 START TEST make
00:02:29.874 ************************************
00:02:29.874 05:02:00 -- common/autotest_common.sh@1104 -- $ make -j112
00:02:29.874 make[1]: Nothing to be done for 'all'.
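
Earlier in this stage, autobuild_common.sh derived the libFuzzer runtime path that configure receives as --with-fuzzer by regex-matching the clang version banner and expanding an extglob pattern. A standalone sketch of that detection, assuming only that clang is on PATH (variable names mirror the trace; the escaped dots in the regex are a minor tightening of the pattern shown above):

    #!/usr/bin/env bash
    # Sketch of the clang-version / fuzzer-archive detection traced above.
    set -e

    # Parse "clang version 16.0.6 ..." out of the multi-line version banner.
    if [[ "$(clang --version)" =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
      clang_version=${BASH_REMATCH[1]}   # e.g. 16.0.6
      clang_num=${BASH_REMATCH[2]}       # e.g. 16
    fi

    export CC=clang-$clang_num
    export CXX=clang++-$clang_num

    # extglob enables the @(a|b) alternation used in the traced glob.
    shopt -s extglob
    fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a)
    fuzzer_lib=${fuzzer_libs[0]}
    if [[ -e $fuzzer_lib ]]; then
      echo "fuzzer runtime: $fuzzer_lib"   # passed to ./configure --with-fuzzer=...
    fi
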
00:02:32.408 The Meson build system
00:02:32.408 Version: 1.3.1
00:02:32.408 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:32.408 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:32.408 Build type: native build
00:02:32.408 Project name: libvfio-user
00:02:32.408 Project version: 0.0.1
00:02:32.408 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:02:32.408 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:02:32.408 Host machine cpu family: x86_64
00:02:32.408 Host machine cpu: x86_64
00:02:32.408 Run-time dependency threads found: YES
00:02:32.408 Library dl found: YES
00:02:32.408 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:32.408 Run-time dependency json-c found: YES 0.17
00:02:32.408 Run-time dependency cmocka found: YES 1.1.7
00:02:32.408 Program pytest-3 found: NO
00:02:32.408 Program flake8 found: NO
00:02:32.408 Program misspell-fixer found: NO
00:02:32.408 Program restructuredtext-lint found: NO
00:02:32.408 Program valgrind found: YES (/usr/bin/valgrind)
00:02:32.408 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:32.408 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:32.408 Compiler for C supports arguments -Wwrite-strings: YES
00:02:32.408 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:32.408 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:32.408 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:32.408 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:32.408 Build targets in project: 8
00:02:32.408 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:02:32.408 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:02:32.408
00:02:32.408 libvfio-user 0.0.1
00:02:32.408
00:02:32.408 User defined options
00:02:32.408 buildtype : debug
00:02:32.408 default_library: static
00:02:32.408 libdir : /usr/local/lib
00:02:32.408
00:02:32.408 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:32.408 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:32.408 [1/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:02:32.408 [2/36] Compiling C object samples/lspci.p/lspci.c.o
00:02:32.408 [3/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:02:32.408 [4/36] Compiling C object samples/null.p/null.c.o
00:02:32.408 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:02:32.408 [6/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:02:32.408 [7/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:02:32.408 [8/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:02:32.408 [9/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:02:32.408 [10/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:02:32.408 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:02:32.408 [12/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:02:32.408 [13/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:02:32.408 [14/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:02:32.408 [15/36] Compiling C object test/unit_tests.p/mocks.c.o
00:02:32.408 [16/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:02:32.408 [17/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:02:32.408 [18/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:02:32.408 [19/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:02:32.408 [20/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:02:32.408 [21/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:02:32.408 [22/36] Compiling C object samples/server.p/server.c.o
00:02:32.667 [23/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:02:32.667 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:02:32.667 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:02:32.667 [26/36] Compiling C object samples/client.p/client.c.o
00:02:32.667 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:02:32.667 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:02:32.667 [29/36] Linking static target lib/libvfio-user.a
00:02:32.667 [30/36] Linking target samples/client
00:02:32.667 [31/36] Linking target test/unit_tests
00:02:32.667 [32/36] Linking target samples/shadow_ioeventfd_server
00:02:32.667 [33/36] Linking target samples/server
00:02:32.667 [34/36] Linking target samples/lspci
00:02:32.667 [35/36] Linking target samples/null
00:02:32.667 [36/36] Linking target samples/gpio-pci-idio-16
00:02:32.667 INFO: autodetecting backend as ninja
00:02:32.667 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:32.667 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:33.604 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:33.604 ninja: no work to do.
00:02:38.880 The Meson build system
00:02:38.880 Version: 1.3.1
00:02:38.880 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
00:02:38.880 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp
00:02:38.880 Build type: native build
00:02:38.880 Program cat found: YES (/usr/bin/cat)
00:02:38.880 Project name: DPDK
00:02:38.880 Project version: 23.11.0
00:02:38.880 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:02:38.880 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:02:38.880 Host machine cpu family: x86_64
00:02:38.880 Host machine cpu: x86_64
00:02:38.880 Message: ## Building in Developer Mode ##
00:02:38.880 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:38.880 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:38.880 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:38.880 Program python3 found: YES (/usr/bin/python3)
00:02:38.880 Program cat found: YES (/usr/bin/cat)
00:02:38.880 Compiler for C supports arguments -march=native: YES
00:02:38.880 Checking for size of "void *" : 8
00:02:38.880 Checking for size of "void *" : 8 (cached)
00:02:38.880 Library m found: YES
00:02:38.880 Library numa found: YES
00:02:38.880 Has header "numaif.h" : YES
00:02:38.880 Library fdt found: NO
00:02:38.880 Library execinfo found: NO
00:02:38.880 Has header "execinfo.h" : YES
00:02:38.880 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:38.880 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:38.880 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:38.880 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:38.880 Run-time dependency openssl found: YES 3.0.9
00:02:38.880 Run-time dependency libpcap found: YES 1.10.4
00:02:38.880 Has header "pcap.h" with dependency libpcap: YES
00:02:38.880 Compiler for C supports arguments -Wcast-qual: YES
00:02:38.880 Compiler for C supports arguments -Wdeprecated: YES
00:02:38.880 Compiler for C supports arguments -Wformat: YES
00:02:38.880 Compiler for C supports arguments -Wformat-nonliteral: YES
00:02:38.880 Compiler for C supports arguments -Wformat-security: YES
00:02:38.880 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:38.880 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:38.880 Compiler for C supports arguments -Wnested-externs: YES
00:02:38.880 Compiler for C supports arguments -Wold-style-definition: YES
00:02:38.880 Compiler for C supports arguments -Wpointer-arith: YES
00:02:38.880 Compiler for C supports arguments -Wsign-compare: YES
00:02:38.880 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:38.880 Compiler for C supports arguments -Wundef: YES
00:02:38.880 Compiler for C supports arguments -Wwrite-strings: YES
00:02:38.880 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:38.880 Compiler for C supports arguments -Wno-packed-not-aligned: NO
00:02:38.880 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:38.880 Program objdump found: YES (/usr/bin/objdump)
00:02:38.880 Compiler for C supports arguments -mavx512f: YES
00:02:38.880 Checking if "AVX512 checking" compiles: YES
00:02:38.880 Fetching value of define "__SSE4_2__" : 1
00:02:38.880 Fetching value of define "__AES__" : 1
00:02:38.880 Fetching value of define "__AVX__" : 1
00:02:38.880 Fetching value of define "__AVX2__" : 1
00:02:38.880 Fetching value of define "__AVX512BW__" : 1
00:02:38.880 Fetching value of define "__AVX512CD__" : 1
00:02:38.880 Fetching value of define "__AVX512DQ__" : 1
00:02:38.880 Fetching value of define "__AVX512F__" : 1
00:02:38.880 Fetching value of define "__AVX512VL__" : 1
00:02:38.880 Fetching value of define "__PCLMUL__" : 1
00:02:38.880 Fetching value of define "__RDRND__" : 1
00:02:38.880 Fetching value of define "__RDSEED__" : 1
00:02:38.880 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:38.880 Fetching value of define "__znver1__" : (undefined)
00:02:38.880 Fetching value of define "__znver2__" : (undefined)
00:02:38.880 Fetching value of define "__znver3__" : (undefined)
00:02:38.880 Fetching value of define "__znver4__" : (undefined)
00:02:38.880 Compiler for C supports arguments -Wno-format-truncation: NO
00:02:38.880 Message: lib/log: Defining dependency "log"
00:02:38.880 Message: lib/kvargs: Defining dependency "kvargs"
00:02:38.880 Message: lib/telemetry: Defining dependency "telemetry"
00:02:38.880 Checking for function "getentropy" : NO
00:02:38.880 Message: lib/eal: Defining dependency "eal"
00:02:38.880 Message: lib/ring: Defining dependency "ring"
00:02:38.880 Message: lib/rcu: Defining dependency "rcu"
00:02:38.880 Message: lib/mempool: Defining dependency "mempool"
00:02:38.880 Message: lib/mbuf: Defining dependency "mbuf"
00:02:38.880 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:38.880 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:38.880 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:38.880 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:38.880 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:38.880 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:38.880 Compiler for C supports arguments -mpclmul: YES
00:02:38.880 Compiler for C supports arguments -maes: YES
00:02:38.880 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:38.880 Compiler for C supports arguments -mavx512bw: YES
00:02:38.880 Compiler for C supports arguments -mavx512dq: YES
00:02:38.880 Compiler for C supports arguments -mavx512vl: YES
00:02:38.880 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:38.880 Compiler for C supports arguments -mavx2: YES
00:02:38.880 Compiler for C supports arguments -mavx: YES
00:02:38.880 Message: lib/net: Defining dependency "net"
00:02:38.881 Message: lib/meter: Defining dependency "meter"
00:02:38.881 Message: lib/ethdev: Defining dependency "ethdev"
00:02:38.881 Message: lib/pci: Defining dependency "pci"
00:02:38.881 Message: lib/cmdline: Defining dependency "cmdline"
00:02:38.881 Message: lib/hash: Defining dependency "hash"
00:02:38.881 Message: lib/timer: Defining dependency "timer"
00:02:38.881 Message: lib/compressdev: Defining dependency "compressdev"
00:02:38.881 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:38.881 Message: lib/dmadev: Defining dependency "dmadev"
00:02:38.881 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:38.881 Message: lib/power: Defining dependency "power"
00:02:38.881 Message: lib/reorder: Defining dependency "reorder"
00:02:38.881 Message: lib/security: Defining dependency "security"
00:02:38.881 Has header "linux/userfaultfd.h" : YES
00:02:38.881 Has header "linux/vduse.h" : YES
00:02:38.881 Message: lib/vhost: Defining dependency "vhost"
00:02:38.881 Compiler for C supports arguments -Wno-format-truncation: NO (cached)
00:02:38.881 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:38.881 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:38.881 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:02:38.881 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:02:38.881 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:02:38.881 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:02:38.881 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:02:38.881 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:02:38.881 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:02:38.881 Program doxygen found: YES (/usr/bin/doxygen)
00:02:38.881 Configuring doxy-api-html.conf using configuration
00:02:38.881 Configuring doxy-api-man.conf using configuration
00:02:38.881 Program mandb found: YES (/usr/bin/mandb)
00:02:38.881 Program sphinx-build found: NO
00:02:38.881 Configuring rte_build_config.h using configuration
00:02:38.881 Message:
00:02:38.881 =================
00:02:38.881 Applications Enabled
00:02:38.881 =================
00:02:38.881
00:02:38.881 apps:
00:02:38.881
00:02:38.881
00:02:38.881 Message:
00:02:38.881 =================
00:02:38.881 Libraries Enabled
00:02:38.881 =================
00:02:38.881
00:02:38.881 libs:
00:02:38.881 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:38.881 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:02:38.881 cryptodev, dmadev, power, reorder, security, vhost,
00:02:38.881
00:02:38.881 Message:
00:02:38.881 ===============
00:02:38.881 Drivers Enabled
00:02:38.881 ===============
00:02:38.881
00:02:38.881 common:
00:02:38.881
00:02:38.881 bus:
00:02:38.881 pci, vdev,
00:02:38.881 mempool:
00:02:38.881 ring,
00:02:38.881 dma:
00:02:38.881
00:02:38.881 net:
00:02:38.881
00:02:38.881 crypto:
00:02:38.881
00:02:38.881 compress:
00:02:38.881
00:02:38.881 vdpa:
00:02:38.881
00:02:38.881
00:02:38.881 Message:
00:02:38.881 =================
00:02:38.881 Content Skipped
00:02:38.881 =================
00:02:38.881
00:02:38.881 apps:
00:02:38.881 dumpcap: explicitly disabled via build config
00:02:38.881 graph: explicitly disabled via build config
00:02:38.881 pdump: explicitly disabled via build config
00:02:38.881 proc-info: explicitly disabled via build config
00:02:38.881 test-acl: explicitly disabled via build config
00:02:38.881 test-bbdev: explicitly disabled via build config
00:02:38.881 test-cmdline: explicitly disabled via build config
00:02:38.881 test-compress-perf: explicitly disabled via build config
00:02:38.881 test-crypto-perf: explicitly disabled via build config
00:02:38.881 test-dma-perf: explicitly disabled via build config
00:02:38.881 test-eventdev: explicitly disabled via build config
00:02:38.881 test-fib: explicitly disabled via build config
00:02:38.881 test-flow-perf: explicitly disabled via build config
00:02:38.881 test-gpudev: explicitly disabled via build config
00:02:38.881 test-mldev: explicitly disabled via build config
00:02:38.881 test-pipeline: explicitly disabled via build config
00:02:38.881 test-pmd: explicitly disabled via build config
00:02:38.881 test-regex: explicitly disabled via build config
00:02:38.881 test-sad: explicitly disabled via build config
00:02:38.881 test-security-perf: explicitly disabled via build config
00:02:38.881
00:02:38.881 libs:
00:02:38.881 metrics: explicitly disabled via build config
00:02:38.881 acl: explicitly disabled via build config
00:02:38.881 bbdev: explicitly disabled via build config
00:02:38.881 bitratestats: explicitly disabled via build config
00:02:38.881 bpf: explicitly disabled via build config
00:02:38.881 cfgfile: explicitly disabled via build config
00:02:38.881 distributor: explicitly disabled via build config
00:02:38.881 efd: explicitly disabled via build config
00:02:38.881 eventdev: explicitly disabled via build config
00:02:38.881 dispatcher: explicitly disabled via build config
00:02:38.881 gpudev: explicitly disabled via build config
00:02:38.881 gro: explicitly disabled via build config
00:02:38.881 gso: explicitly disabled via build config
00:02:38.881 ip_frag: explicitly disabled via build config
00:02:38.881 jobstats: explicitly disabled via build config
00:02:38.881 latencystats: explicitly disabled via build config
00:02:38.881 lpm: explicitly disabled via build config
00:02:38.881 member: explicitly disabled via build config
00:02:38.881 pcapng: explicitly disabled via build config
00:02:38.881 rawdev: explicitly disabled via build config
00:02:38.881 regexdev: explicitly disabled via build config
00:02:38.881 mldev: explicitly disabled via build config
00:02:38.881 rib: explicitly disabled via build config
00:02:38.881 sched: explicitly disabled via build config
00:02:38.881 stack: explicitly disabled via build config
00:02:38.881 ipsec: explicitly disabled via build config
00:02:38.881 pdcp: explicitly disabled via build config
00:02:38.881 fib: explicitly disabled via build config
00:02:38.881 port: explicitly disabled via build config
00:02:38.881 pdump: explicitly disabled via build config
00:02:38.881 table: explicitly disabled via build config
00:02:38.881 pipeline: explicitly disabled via build config
00:02:38.881 graph: explicitly disabled via build config
00:02:38.881 node: explicitly disabled via build config
00:02:38.881
00:02:38.881 drivers:
00:02:38.881 common/cpt: not in enabled drivers build config
00:02:38.881 common/dpaax: not in enabled drivers build config
00:02:38.881 common/iavf: not in enabled drivers build config
00:02:38.881 common/idpf: not in enabled drivers build config
00:02:38.881 common/mvep: not in enabled drivers build config
00:02:38.881 common/octeontx: not in enabled drivers build config
00:02:38.881 bus/auxiliary: not in enabled drivers build config
00:02:38.881 bus/cdx: not in enabled drivers build config
00:02:38.881 bus/dpaa: not in enabled drivers build config
00:02:38.881 bus/fslmc: not in enabled drivers build config
00:02:38.881 bus/ifpga: not in enabled drivers build config
00:02:38.881 bus/platform: not in enabled drivers build config
00:02:38.881 bus/vmbus: not in enabled drivers build config
00:02:38.881 common/cnxk: not in enabled drivers build config
00:02:38.881 common/mlx5: not in enabled drivers build config
00:02:38.881 common/nfp: not in enabled drivers build config
00:02:38.881 common/qat: not in enabled drivers build config
00:02:38.881 common/sfc_efx: not in enabled drivers build config
00:02:38.881 mempool/bucket: not in enabled drivers build config
00:02:38.881 mempool/cnxk: not in enabled drivers build config
00:02:38.881 mempool/dpaa: not in enabled drivers build config
00:02:38.881 mempool/dpaa2: not in enabled drivers build config
00:02:38.881 mempool/octeontx: not in enabled drivers build config
00:02:38.881 mempool/stack: not in enabled drivers build config
00:02:38.881 dma/cnxk: not in enabled drivers build config
00:02:38.881 dma/dpaa: not in enabled drivers build config
00:02:38.881 dma/dpaa2: not in enabled drivers build config
00:02:38.881 dma/hisilicon: not in enabled drivers build config
00:02:38.881 dma/idxd: not in enabled drivers build config
00:02:38.881 dma/ioat: not in enabled drivers build config
00:02:38.881 dma/skeleton: not in enabled drivers build config
00:02:38.881 net/af_packet: not in enabled drivers build config
00:02:38.881 net/af_xdp: not in enabled drivers build config
00:02:38.881 net/ark: not in enabled drivers build config
00:02:38.881 net/atlantic: not in enabled drivers build config
00:02:38.881 net/avp: not in enabled drivers build config
00:02:38.881 net/axgbe: not in enabled drivers build config
00:02:38.881 net/bnx2x: not in enabled drivers build config
00:02:38.881 net/bnxt: not in enabled drivers build config
00:02:38.881 net/bonding: not in enabled drivers build config
00:02:38.881 net/cnxk: not in enabled drivers build config
00:02:38.881 net/cpfl: not in enabled drivers build config
00:02:38.881 net/cxgbe: not in enabled drivers build config
00:02:38.881 net/dpaa: not in enabled drivers build config
00:02:38.881 net/dpaa2: not in enabled drivers build config
00:02:38.881 net/e1000: not in enabled drivers build config
00:02:38.881 net/ena: not in enabled drivers build config
00:02:38.881 net/enetc: not in enabled drivers build config
00:02:38.881 net/enetfec: not in enabled drivers build config
00:02:38.881 net/enic: not in enabled drivers build config
00:02:38.881 net/failsafe: not in enabled drivers build config
00:02:38.881 net/fm10k: not in enabled drivers build config
00:02:38.881 net/gve: not in enabled drivers build config
00:02:38.881 net/hinic: not in enabled drivers build config
00:02:38.881 net/hns3: not in enabled drivers build config
00:02:38.881 net/i40e: not in enabled drivers build config
00:02:38.881 net/iavf: not in enabled drivers build config
00:02:38.881 net/ice: not in enabled drivers build config
00:02:38.881 net/idpf: not in enabled drivers build config
00:02:38.881 net/igc: not in enabled drivers build config
00:02:38.882 net/ionic: not in enabled drivers build config
00:02:38.882 net/ipn3ke: not in enabled drivers build config
00:02:38.882 net/ixgbe: not in enabled drivers build config
00:02:38.882 net/mana: not in enabled drivers build config
00:02:38.882 net/memif: not in enabled drivers build config
00:02:38.882 net/mlx4: not in enabled drivers build config
00:02:38.882 net/mlx5: not in enabled drivers build config
00:02:38.882 net/mvneta: not in enabled drivers build config
00:02:38.882 net/mvpp2: not in enabled drivers build config
00:02:38.882 net/netvsc: not in enabled drivers build config
00:02:38.882 net/nfb: not in enabled drivers build config
00:02:38.882 net/nfp: not in enabled drivers build config
00:02:38.882 net/ngbe: not in enabled drivers build config
00:02:38.882 net/null: not in enabled drivers build config
00:02:38.882 net/octeontx: not in enabled drivers build config
00:02:38.882 net/octeon_ep: not in enabled drivers build config
00:02:38.882 net/pcap: not in enabled drivers build config
00:02:38.882 net/pfe: not in enabled drivers build config
00:02:38.882 net/qede: not in enabled drivers build config
00:02:38.882 net/ring: not in enabled drivers build config
00:02:38.882 net/sfc: not in enabled drivers build config
00:02:38.882 net/softnic: not in enabled drivers build config
00:02:38.882 net/tap: not in enabled drivers build config
00:02:38.882 net/thunderx: not in enabled drivers build config
00:02:38.882 net/txgbe: not in enabled drivers build config
00:02:38.882 net/vdev_netvsc: not in enabled drivers build config
00:02:38.882 net/vhost: not in enabled drivers build config
00:02:38.882 net/virtio: not in enabled drivers build config
00:02:38.882 net/vmxnet3: not in enabled drivers build config
00:02:38.882 raw/*: missing internal dependency, "rawdev"
00:02:38.882 crypto/armv8: not in enabled drivers build config
00:02:38.882 crypto/bcmfs: not in enabled drivers build config
00:02:38.882 crypto/caam_jr: not in enabled drivers build config
00:02:38.882 crypto/ccp: not in enabled drivers build config
00:02:38.882 crypto/cnxk: not in enabled drivers build config
00:02:38.882 crypto/dpaa_sec: not in enabled drivers build config
00:02:38.882 crypto/dpaa2_sec: not in enabled drivers build config
00:02:38.882 crypto/ipsec_mb: not in enabled drivers build config
00:02:38.882 crypto/mlx5: not in enabled drivers build config
00:02:38.882 crypto/mvsam: not in enabled drivers build config
00:02:38.882 crypto/nitrox: not in enabled drivers build config
00:02:38.882 crypto/null: not in enabled drivers build config
00:02:38.882 crypto/octeontx: not in enabled drivers build config
00:02:38.882 crypto/openssl: not in enabled drivers build config
00:02:38.882 crypto/scheduler: not in enabled drivers build config
00:02:38.882 crypto/uadk: not in enabled drivers build config
00:02:38.882 crypto/virtio: not in enabled drivers build config
00:02:38.882 compress/isal: not in enabled drivers build config
00:02:38.882 compress/mlx5: not in enabled drivers build config
00:02:38.882 compress/octeontx: not in enabled drivers build config
00:02:38.882 compress/zlib: not in enabled drivers build config
00:02:38.882 regex/*: missing internal dependency, "regexdev"
00:02:38.882 ml/*: missing internal dependency, "mldev"
00:02:38.882 vdpa/ifc: not in enabled drivers build config
00:02:38.882 vdpa/mlx5: not in enabled drivers build config
00:02:38.882 vdpa/nfp: not in enabled drivers build config
00:02:38.882 vdpa/sfc: not in enabled drivers build config
00:02:38.882 event/*: missing internal dependency, "eventdev"
00:02:38.882 baseband/*: missing internal dependency, "bbdev"
00:02:38.882 gpu/*: missing internal dependency, "gpudev"
00:02:38.882
00:02:38.882
00:02:38.882 Build targets in project: 85
00:02:38.882
00:02:38.882 DPDK 23.11.0
00:02:38.882
00:02:38.882 User defined options
00:02:38.882 buildtype : debug
00:02:38.882 default_library : static
00:02:38.882 libdir : lib
00:02:38.882 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:02:38.882 c_args : -fPIC -Werror
00:02:38.882 c_link_args :
00:02:38.882 cpu_instruction_set: native
00:02:38.882 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev
00:02:38.882 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,pcapng,bbdev
00:02:38.882 enable_docs : false
00:02:38.882 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring
00:02:38.882 enable_kmods : false
00:02:38.882 tests : false
00:02:38.882
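
The "User defined options" summary above is DPDK's record of the options it was configured with. SPDK's build scripts assemble that configure command internally, so it is not printed in this log; a meson invocation that would reproduce the same summary looks roughly like the sketch below (run from the dpdk source directory; the exact flag order and quoting are assumptions):

    # Hypothetical reconstruction of the DPDK meson configuration whose
    # option summary is printed above.
    meson setup build-tmp \
      --prefix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build \
      --libdir lib \
      --buildtype debug \
      --default-library static \
      -Dc_args='-fPIC -Werror' \
      -Dcpu_instruction_set=native \
      -Ddisable_apps='test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev' \
      -Ddisable_libs='port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,pcapng,bbdev' \
      -Denable_docs=false \
      -Denable_drivers='bus,bus/pci,bus/vdev,mempool/ring' \
      -Denable_kmods=false \
      -Dtests=false
    ninja -C build-tmp    # produces the [1/265] ... compile steps that follow
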
00:02:38.882 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:39.452 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp'
00:02:39.715 [1/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:02:39.715 [2/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:02:39.715 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:02:39.715 [4/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:02:39.715 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:02:39.715 [6/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:02:39.715 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:02:39.715 [8/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:02:39.715 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:02:39.715 [10/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:02:39.715 [11/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:02:39.715 [12/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:02:39.715 [13/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:02:39.715 [14/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:02:39.715 [15/265] Compiling C object lib/librte_log.a.p/log_log.c.o
00:02:39.715 [16/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:02:39.715 [17/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:02:39.715 [18/265] Linking static target lib/librte_kvargs.a
00:02:39.715 [19/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:02:39.715 [20/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:02:39.715 [21/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:02:39.715 [22/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:02:39.715 [23/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:02:39.715 [24/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:02:39.715 [25/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:02:39.715 [26/265] Linking static target lib/librte_log.a
00:02:39.715 [27/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:02:39.715 [28/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:02:39.715 [29/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:02:39.715 [30/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:02:39.715 [31/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:02:39.715 [32/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:02:39.715 [33/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:02:39.715 [34/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:02:39.715 [35/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:02:39.715 [36/265] Linking static target lib/librte_pci.a
00:02:39.715 [37/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:02:39.715 [38/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:02:39.715 [39/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:02:39.974 [40/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:02:39.974 [41/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:02:39.974 [42/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.974 [43/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:40.232 [44/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:02:40.232 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:02:40.232 [46/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:02:40.232 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:02:40.232 [48/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:02:40.232 [49/265] Linking static target lib/librte_meter.a
00:02:40.232 [50/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:02:40.232 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:02:40.232 [52/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:02:40.232 [53/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:02:40.232 [54/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:02:40.232 [55/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:02:40.232 [56/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:02:40.232 [57/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:02:40.232 [58/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:02:40.232 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:02:40.232 [60/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:02:40.232 [61/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:02:40.232 [62/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:02:40.232 [63/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:02:40.232 [64/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:02:40.232 [65/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:02:40.232 [66/265] Linking static target lib/librte_timer.a
00:02:40.232 [67/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:02:40.232 [68/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:02:40.232 [69/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:02:40.232 [70/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:02:40.232 [71/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:02:40.232 [72/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:02:40.232 [73/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:02:40.232 [74/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:02:40.232 [75/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:02:40.232 [76/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:02:40.232 [77/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:02:40.232 [78/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:02:40.232 [79/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:02:40.233 [80/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:02:40.233 [81/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:02:40.233 [82/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:02:40.233 [83/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:02:40.233 [84/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:02:40.233 [85/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:02:40.233 [86/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:02:40.233 [87/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:02:40.233 [88/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:02:40.233 [89/265] Linking static target lib/librte_telemetry.a
00:02:40.233 [90/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:02:40.233 [91/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:02:40.233 [92/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:02:40.233 [93/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:02:40.233 [94/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:02:40.233 [95/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:02:40.233 [96/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:02:40.233 [97/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:02:40.233 [98/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:02:40.233 [99/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:02:40.233 [100/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:02:40.233 [101/265] Linking static target lib/net/libnet_crc_avx512_lib.a
00:02:40.233 [102/265] Linking static target lib/librte_ring.a
00:02:40.233 [103/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:02:40.233 [104/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:02:40.233 [105/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:02:40.233 [106/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:02:40.233 [107/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:02:40.233 [108/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:02:40.233 [109/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:02:40.233 [110/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:02:40.497 [111/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o
00:02:40.497 [112/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:02:40.497 [113/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:02:40.497 [114/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o
00:02:40.497 [115/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o
00:02:40.497 [116/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:02:40.497 [117/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:02:40.497 [118/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:02:40.497 [119/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:02:40.497 [120/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o
00:02:40.497 [121/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o
00:02:40.497 [122/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:02:40.497 [123/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:02:40.497 [124/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o
00:02:40.497 [125/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:02:40.497 [126/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:02:40.497 [127/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:02:40.497 [128/265] Linking static target lib/librte_cmdline.a
00:02:40.497 [129/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:02:40.497 [130/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o
00:02:40.497 [131/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:02:40.497 [132/265] Linking static target lib/librte_dmadev.a
00:02:40.497 [133/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:02:40.497 [134/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o
00:02:40.497 [135/265] Linking static target lib/librte_mempool.a
00:02:40.497 [136/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:02:40.497 [137/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:02:40.551 [138/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:02:40.551 [139/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:02:40.551 [140/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:02:40.551 [141/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:02:40.551 [142/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o
00:02:40.551 [143/265] Linking target lib/librte_log.so.24.0
00:02:40.551 [144/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o
00:02:40.551 [145/265] Linking static target lib/librte_rcu.a
00:02:40.551 [146/265] Linking static target lib/librte_net.a
00:02:40.551 [147/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o
00:02:40.551 [148/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:02:40.551 [149/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o
00:02:40.551 [150/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:02:40.551 [151/265] Linking static target lib/librte_reorder.a
00:02:40.551 [152/265] Linking static target lib/librte_power.a
00:02:40.551 [153/265] Linking static target lib/librte_compressdev.a
00:02:40.551 [154/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:02:40.551 [155/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:02:40.551 [156/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o
00:02:40.551 [157/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:02:40.551 [158/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o
00:02:40.551 [159/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o
00:02:40.551 [160/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o
00:02:40.551 [161/265]
Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:40.551 [162/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:40.551 [163/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:40.551 [164/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:40.551 [165/265] Linking static target lib/librte_hash.a 00:02:40.813 [166/265] Linking static target lib/librte_security.a 00:02:40.813 [167/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:40.813 [168/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:40.813 [169/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:40.813 [170/265] Linking target lib/librte_kvargs.so.24.0 00:02:40.813 [171/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:40.813 [172/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:40.813 [173/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:40.813 [174/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:40.813 [175/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:40.813 [176/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:40.813 [177/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:40.813 [178/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:40.813 [179/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:40.813 [180/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:40.813 [181/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:40.813 [182/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:40.813 [183/265] Linking static target lib/librte_cryptodev.a 00:02:40.813 [184/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:40.813 [185/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:40.813 [186/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:40.813 [187/265] Linking static target lib/librte_mbuf.a 00:02:40.813 [188/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:40.813 [189/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:40.813 [190/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:40.813 [191/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:40.813 [192/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:40.813 [193/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:40.813 [194/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.077 [195/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:41.077 [196/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:41.077 [197/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.077 [198/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.077 [199/265] Linking static target lib/librte_eal.a 00:02:41.077 [200/265] Compiling C object 
drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:41.078 [201/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:41.078 [202/265] Linking target lib/librte_telemetry.so.24.0 00:02:41.078 [203/265] Linking static target drivers/librte_bus_vdev.a 00:02:41.078 [204/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:41.078 [205/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:41.078 [206/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:41.078 [207/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:41.078 [208/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:41.078 [209/265] Linking static target drivers/librte_bus_pci.a 00:02:41.078 [210/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:41.078 [211/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:41.078 [212/265] Linking static target lib/librte_ethdev.a 00:02:41.078 [213/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:41.078 [214/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:41.078 [215/265] Linking static target drivers/librte_mempool_ring.a 00:02:41.337 [216/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.337 [217/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.337 [218/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.597 [219/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.597 [220/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.597 [221/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.857 [222/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.857 [223/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.857 [224/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:41.857 [225/265] Linking static target lib/librte_vhost.a 00:02:41.857 [226/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:43.237 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:44.177 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:50.755 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:54.046 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:54.046 [231/265] Linking target lib/librte_eal.so.24.0 00:02:54.306 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:54.306 [233/265] Linking target lib/librte_meter.so.24.0 00:02:54.306 [234/265] Linking target drivers/librte_bus_vdev.so.24.0 00:02:54.306 [235/265] Linking target lib/librte_ring.so.24.0 00:02:54.306 [236/265] Linking target lib/librte_pci.so.24.0 00:02:54.306 [237/265] Linking target lib/librte_dmadev.so.24.0 
00:02:54.306 [238/265] Linking target lib/librte_timer.so.24.0 00:02:54.565 [239/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:54.565 [240/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:54.565 [241/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:54.565 [242/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:54.565 [243/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:54.565 [244/265] Linking target lib/librte_mempool.so.24.0 00:02:54.565 [245/265] Linking target lib/librte_rcu.so.24.0 00:02:54.565 [246/265] Linking target drivers/librte_bus_pci.so.24.0 00:02:54.565 [247/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:54.565 [248/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:54.824 [249/265] Linking target lib/librte_mbuf.so.24.0 00:02:54.824 [250/265] Linking target drivers/librte_mempool_ring.so.24.0 00:02:54.824 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:55.083 [252/265] Linking target lib/librte_reorder.so.24.0 00:02:55.083 [253/265] Linking target lib/librte_compressdev.so.24.0 00:02:55.083 [254/265] Linking target lib/librte_cryptodev.so.24.0 00:02:55.083 [255/265] Linking target lib/librte_net.so.24.0 00:02:55.083 [256/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:55.083 [257/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:55.343 [258/265] Linking target lib/librte_hash.so.24.0 00:02:55.343 [259/265] Linking target lib/librte_ethdev.so.24.0 00:02:55.343 [260/265] Linking target lib/librte_security.so.24.0 00:02:55.343 [261/265] Linking target lib/librte_cmdline.so.24.0 00:02:55.343 [262/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:55.343 [263/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:55.602 [264/265] Linking target lib/librte_power.so.24.0 00:02:55.602 [265/265] Linking target lib/librte_vhost.so.24.0 00:02:55.602 INFO: autodetecting backend as ninja 00:02:55.602 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 112 00:02:56.541 CC lib/log/log.o 00:02:56.541 CC lib/log/log_deprecated.o 00:02:56.541 CC lib/log/log_flags.o 00:02:56.541 CC lib/ut_mock/mock.o 00:02:56.541 CC lib/ut/ut.o 00:02:56.541 LIB libspdk_ut_mock.a 00:02:56.541 LIB libspdk_ut.a 00:02:56.801 LIB libspdk_log.a 00:02:57.061 CC lib/util/base64.o 00:02:57.061 CXX lib/trace_parser/trace.o 00:02:57.061 CC lib/util/bit_array.o 00:02:57.061 CC lib/util/cpuset.o 00:02:57.061 CC lib/util/crc32.o 00:02:57.061 CC lib/util/crc16.o 00:02:57.061 CC lib/util/crc32c.o 00:02:57.061 CC lib/util/crc32_ieee.o 00:02:57.061 CC lib/util/crc64.o 00:02:57.061 CC lib/util/dif.o 00:02:57.061 CC lib/ioat/ioat.o 00:02:57.061 CC lib/util/hexlify.o 00:02:57.061 CC lib/util/fd.o 00:02:57.061 CC lib/util/file.o 00:02:57.061 CC lib/util/iov.o 00:02:57.061 CC lib/dma/dma.o 00:02:57.061 CC lib/util/math.o 00:02:57.061 CC lib/util/pipe.o 00:02:57.061 CC lib/util/strerror_tls.o 00:02:57.061 CC lib/util/string.o 00:02:57.061 CC lib/util/uuid.o 00:02:57.061 CC lib/util/fd_group.o 00:02:57.061 CC lib/util/zipf.o 00:02:57.061 CC lib/util/xor.o 00:02:57.321 CC 
lib/vfio_user/host/vfio_user_pci.o 00:02:57.321 CC lib/vfio_user/host/vfio_user.o 00:02:57.321 LIB libspdk_dma.a 00:02:57.321 LIB libspdk_ioat.a 00:02:57.321 LIB libspdk_vfio_user.a 00:02:57.580 LIB libspdk_util.a 00:02:57.580 LIB libspdk_trace_parser.a 00:02:57.839 CC lib/env_dpdk/env.o 00:02:57.839 CC lib/conf/conf.o 00:02:57.839 CC lib/env_dpdk/memory.o 00:02:57.839 CC lib/env_dpdk/pci.o 00:02:57.839 CC lib/env_dpdk/threads.o 00:02:57.839 CC lib/env_dpdk/init.o 00:02:57.839 CC lib/env_dpdk/pci_ioat.o 00:02:57.839 CC lib/env_dpdk/pci_virtio.o 00:02:57.839 CC lib/env_dpdk/pci_event.o 00:02:57.839 CC lib/vmd/vmd.o 00:02:57.839 CC lib/env_dpdk/pci_vmd.o 00:02:57.839 CC lib/vmd/led.o 00:02:57.839 CC lib/env_dpdk/sigbus_handler.o 00:02:57.839 CC lib/env_dpdk/pci_idxd.o 00:02:57.839 CC lib/env_dpdk/pci_dpdk.o 00:02:57.839 CC lib/rdma/common.o 00:02:57.839 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:57.839 CC lib/rdma/rdma_verbs.o 00:02:57.839 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:57.839 CC lib/json/json_util.o 00:02:57.839 CC lib/json/json_parse.o 00:02:57.840 CC lib/idxd/idxd.o 00:02:57.840 CC lib/idxd/idxd_user.o 00:02:57.840 CC lib/json/json_write.o 00:02:57.840 CC lib/idxd/idxd_kernel.o 00:02:58.139 LIB libspdk_conf.a 00:02:58.139 LIB libspdk_rdma.a 00:02:58.139 LIB libspdk_json.a 00:02:58.139 LIB libspdk_idxd.a 00:02:58.399 LIB libspdk_vmd.a 00:02:58.399 CC lib/jsonrpc/jsonrpc_server.o 00:02:58.399 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:58.399 CC lib/jsonrpc/jsonrpc_client.o 00:02:58.399 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:58.659 LIB libspdk_jsonrpc.a 00:02:58.922 CC lib/rpc/rpc.o 00:02:58.922 LIB libspdk_env_dpdk.a 00:02:59.182 LIB libspdk_rpc.a 00:02:59.441 CC lib/notify/notify.o 00:02:59.441 CC lib/notify/notify_rpc.o 00:02:59.441 CC lib/trace/trace.o 00:02:59.441 CC lib/trace/trace_flags.o 00:02:59.441 CC lib/trace/trace_rpc.o 00:02:59.441 CC lib/sock/sock.o 00:02:59.441 CC lib/sock/sock_rpc.o 00:02:59.700 LIB libspdk_notify.a 00:02:59.700 LIB libspdk_trace.a 00:02:59.700 LIB libspdk_sock.a 00:02:59.960 CC lib/thread/thread.o 00:02:59.960 CC lib/thread/iobuf.o 00:03:00.219 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:00.219 CC lib/nvme/nvme_ctrlr.o 00:03:00.219 CC lib/nvme/nvme_fabric.o 00:03:00.219 CC lib/nvme/nvme_ns_cmd.o 00:03:00.219 CC lib/nvme/nvme_ns.o 00:03:00.219 CC lib/nvme/nvme_pcie_common.o 00:03:00.219 CC lib/nvme/nvme_pcie.o 00:03:00.219 CC lib/nvme/nvme_qpair.o 00:03:00.219 CC lib/nvme/nvme.o 00:03:00.220 CC lib/nvme/nvme_discovery.o 00:03:00.220 CC lib/nvme/nvme_quirks.o 00:03:00.220 CC lib/nvme/nvme_transport.o 00:03:00.220 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:00.220 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:00.220 CC lib/nvme/nvme_tcp.o 00:03:00.220 CC lib/nvme/nvme_opal.o 00:03:00.220 CC lib/nvme/nvme_io_msg.o 00:03:00.220 CC lib/nvme/nvme_poll_group.o 00:03:00.220 CC lib/nvme/nvme_zns.o 00:03:00.220 CC lib/nvme/nvme_cuse.o 00:03:00.220 CC lib/nvme/nvme_vfio_user.o 00:03:00.220 CC lib/nvme/nvme_rdma.o 00:03:01.157 LIB libspdk_thread.a 00:03:01.416 CC lib/accel/accel.o 00:03:01.416 CC lib/init/json_config.o 00:03:01.416 CC lib/accel/accel_rpc.o 00:03:01.416 CC lib/init/subsystem.o 00:03:01.416 CC lib/accel/accel_sw.o 00:03:01.416 CC lib/init/subsystem_rpc.o 00:03:01.416 CC lib/init/rpc.o 00:03:01.416 CC lib/virtio/virtio.o 00:03:01.416 CC lib/virtio/virtio_vhost_user.o 00:03:01.416 CC lib/virtio/virtio_vfio_user.o 00:03:01.416 CC lib/virtio/virtio_pci.o 00:03:01.416 CC lib/vfu_tgt/tgt_endpoint.o 00:03:01.416 CC lib/vfu_tgt/tgt_rpc.o 00:03:01.416 CC lib/blob/blobstore.o 
00:03:01.416 CC lib/blob/request.o 00:03:01.416 CC lib/blob/zeroes.o 00:03:01.416 CC lib/blob/blob_bs_dev.o 00:03:01.416 LIB libspdk_init.a 00:03:01.676 LIB libspdk_virtio.a 00:03:01.676 LIB libspdk_vfu_tgt.a 00:03:01.676 LIB libspdk_nvme.a 00:03:01.935 CC lib/event/app.o 00:03:01.935 CC lib/event/reactor.o 00:03:01.935 CC lib/event/log_rpc.o 00:03:01.935 CC lib/event/app_rpc.o 00:03:01.935 CC lib/event/scheduler_static.o 00:03:02.194 LIB libspdk_event.a 00:03:02.194 LIB libspdk_accel.a 00:03:02.453 CC lib/bdev/bdev.o 00:03:02.453 CC lib/bdev/bdev_rpc.o 00:03:02.453 CC lib/bdev/bdev_zone.o 00:03:02.453 CC lib/bdev/part.o 00:03:02.453 CC lib/bdev/scsi_nvme.o 00:03:03.390 LIB libspdk_blob.a 00:03:03.650 CC lib/blobfs/blobfs.o 00:03:03.650 CC lib/blobfs/tree.o 00:03:03.650 CC lib/lvol/lvol.o 00:03:04.219 LIB libspdk_lvol.a 00:03:04.788 LIB libspdk_bdev.a 00:03:05.047 CC lib/scsi/dev.o 00:03:05.048 CC lib/scsi/lun.o 00:03:05.048 CC lib/scsi/port.o 00:03:05.048 CC lib/scsi/scsi.o 00:03:05.048 CC lib/scsi/scsi_pr.o 00:03:05.048 CC lib/scsi/scsi_bdev.o 00:03:05.048 CC lib/scsi/scsi_rpc.o 00:03:05.048 CC lib/nvmf/ctrlr.o 00:03:05.048 CC lib/scsi/task.o 00:03:05.048 CC lib/nvmf/ctrlr_discovery.o 00:03:05.048 CC lib/nvmf/ctrlr_bdev.o 00:03:05.048 CC lib/nvmf/nvmf.o 00:03:05.048 CC lib/nvmf/subsystem.o 00:03:05.048 CC lib/nvmf/nvmf_rpc.o 00:03:05.048 CC lib/nvmf/transport.o 00:03:05.048 CC lib/nvmf/tcp.o 00:03:05.048 CC lib/nvmf/vfio_user.o 00:03:05.048 CC lib/nvmf/rdma.o 00:03:05.048 CC lib/nbd/nbd.o 00:03:05.048 CC lib/nbd/nbd_rpc.o 00:03:05.048 CC lib/ftl/ftl_core.o 00:03:05.048 CC lib/ublk/ublk.o 00:03:05.048 CC lib/ftl/ftl_init.o 00:03:05.048 CC lib/ublk/ublk_rpc.o 00:03:05.048 CC lib/ftl/ftl_layout.o 00:03:05.048 CC lib/ftl/ftl_debug.o 00:03:05.048 CC lib/ftl/ftl_io.o 00:03:05.048 CC lib/ftl/ftl_sb.o 00:03:05.048 CC lib/ftl/ftl_l2p.o 00:03:05.048 CC lib/ftl/ftl_band.o 00:03:05.048 CC lib/ftl/ftl_l2p_flat.o 00:03:05.048 CC lib/ftl/ftl_nv_cache.o 00:03:05.048 CC lib/ftl/ftl_writer.o 00:03:05.048 CC lib/ftl/ftl_band_ops.o 00:03:05.048 CC lib/ftl/ftl_rq.o 00:03:05.048 CC lib/ftl/ftl_reloc.o 00:03:05.048 CC lib/ftl/ftl_l2p_cache.o 00:03:05.048 CC lib/ftl/ftl_p2l.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:05.048 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:05.048 CC lib/ftl/utils/ftl_conf.o 00:03:05.048 CC lib/ftl/utils/ftl_md.o 00:03:05.048 CC lib/ftl/utils/ftl_mempool.o 00:03:05.048 CC lib/ftl/utils/ftl_property.o 00:03:05.048 CC lib/ftl/utils/ftl_bitmap.o 00:03:05.048 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:05.048 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:05.048 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:05.048 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:05.048 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:05.048 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:05.048 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:05.048 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:05.048 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:05.048 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:05.048 CC lib/ftl/base/ftl_base_dev.o 00:03:05.048 CC 
lib/ftl/base/ftl_base_bdev.o 00:03:05.048 CC lib/ftl/ftl_trace.o 00:03:05.307 LIB libspdk_blobfs.a 00:03:05.565 LIB libspdk_nbd.a 00:03:05.565 LIB libspdk_scsi.a 00:03:05.565 LIB libspdk_ublk.a 00:03:05.824 LIB libspdk_ftl.a 00:03:05.824 CC lib/iscsi/conn.o 00:03:05.824 CC lib/iscsi/init_grp.o 00:03:05.824 CC lib/iscsi/iscsi.o 00:03:05.824 CC lib/iscsi/md5.o 00:03:05.824 CC lib/iscsi/param.o 00:03:05.824 CC lib/iscsi/portal_grp.o 00:03:05.824 CC lib/vhost/vhost.o 00:03:05.824 CC lib/iscsi/tgt_node.o 00:03:05.824 CC lib/vhost/vhost_rpc.o 00:03:05.824 CC lib/iscsi/iscsi_subsystem.o 00:03:05.824 CC lib/vhost/vhost_scsi.o 00:03:05.824 CC lib/iscsi/iscsi_rpc.o 00:03:05.824 CC lib/vhost/vhost_blk.o 00:03:05.824 CC lib/iscsi/task.o 00:03:05.824 CC lib/vhost/rte_vhost_user.o 00:03:06.760 LIB libspdk_nvmf.a 00:03:06.760 LIB libspdk_vhost.a 00:03:07.020 LIB libspdk_iscsi.a 00:03:07.279 CC module/env_dpdk/env_dpdk_rpc.o 00:03:07.279 CC module/vfu_device/vfu_virtio_blk.o 00:03:07.279 CC module/vfu_device/vfu_virtio.o 00:03:07.279 CC module/vfu_device/vfu_virtio_scsi.o 00:03:07.279 CC module/vfu_device/vfu_virtio_rpc.o 00:03:07.538 LIB libspdk_env_dpdk_rpc.a 00:03:07.538 CC module/blob/bdev/blob_bdev.o 00:03:07.538 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:07.538 CC module/accel/ioat/accel_ioat.o 00:03:07.538 CC module/accel/ioat/accel_ioat_rpc.o 00:03:07.538 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:07.538 CC module/scheduler/gscheduler/gscheduler.o 00:03:07.538 CC module/accel/error/accel_error.o 00:03:07.538 CC module/accel/error/accel_error_rpc.o 00:03:07.538 CC module/accel/iaa/accel_iaa.o 00:03:07.538 CC module/accel/iaa/accel_iaa_rpc.o 00:03:07.538 CC module/sock/posix/posix.o 00:03:07.538 CC module/accel/dsa/accel_dsa.o 00:03:07.538 CC module/accel/dsa/accel_dsa_rpc.o 00:03:07.538 LIB libspdk_scheduler_gscheduler.a 00:03:07.538 LIB libspdk_scheduler_dpdk_governor.a 00:03:07.797 LIB libspdk_accel_error.a 00:03:07.797 LIB libspdk_scheduler_dynamic.a 00:03:07.797 LIB libspdk_accel_ioat.a 00:03:07.797 LIB libspdk_accel_iaa.a 00:03:07.797 LIB libspdk_blob_bdev.a 00:03:07.797 LIB libspdk_accel_dsa.a 00:03:07.797 LIB libspdk_vfu_device.a 00:03:08.056 LIB libspdk_sock_posix.a 00:03:08.056 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:08.056 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:08.056 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:08.315 CC module/bdev/passthru/vbdev_passthru.o 00:03:08.315 CC module/bdev/delay/vbdev_delay.o 00:03:08.315 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:08.315 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:08.315 CC module/bdev/gpt/gpt.o 00:03:08.315 CC module/bdev/gpt/vbdev_gpt.o 00:03:08.315 CC module/bdev/aio/bdev_aio.o 00:03:08.315 CC module/bdev/aio/bdev_aio_rpc.o 00:03:08.315 CC module/bdev/split/vbdev_split_rpc.o 00:03:08.315 CC module/bdev/split/vbdev_split.o 00:03:08.315 CC module/bdev/malloc/bdev_malloc.o 00:03:08.315 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:08.315 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:08.315 CC module/bdev/ftl/bdev_ftl.o 00:03:08.315 CC module/blobfs/bdev/blobfs_bdev.o 00:03:08.315 CC module/bdev/lvol/vbdev_lvol.o 00:03:08.315 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:08.315 CC module/bdev/null/bdev_null.o 00:03:08.315 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:08.315 CC module/bdev/null/bdev_null_rpc.o 00:03:08.315 CC module/bdev/error/vbdev_error.o 00:03:08.315 CC module/bdev/error/vbdev_error_rpc.o 00:03:08.315 CC module/bdev/iscsi/bdev_iscsi.o 00:03:08.315 CC 
module/bdev/zone_block/vbdev_zone_block.o 00:03:08.315 CC module/bdev/nvme/bdev_nvme.o 00:03:08.315 CC module/bdev/raid/bdev_raid.o 00:03:08.315 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:08.315 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:08.315 CC module/bdev/raid/bdev_raid_rpc.o 00:03:08.315 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:08.315 CC module/bdev/raid/bdev_raid_sb.o 00:03:08.315 CC module/bdev/nvme/nvme_rpc.o 00:03:08.315 CC module/bdev/nvme/bdev_mdns_client.o 00:03:08.315 CC module/bdev/raid/raid0.o 00:03:08.315 CC module/bdev/nvme/vbdev_opal.o 00:03:08.315 CC module/bdev/raid/raid1.o 00:03:08.315 CC module/bdev/raid/concat.o 00:03:08.315 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:08.315 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:08.315 LIB libspdk_blobfs_bdev.a 00:03:08.315 LIB libspdk_bdev_split.a 00:03:08.315 LIB libspdk_bdev_error.a 00:03:08.575 LIB libspdk_bdev_gpt.a 00:03:08.575 LIB libspdk_bdev_null.a 00:03:08.575 LIB libspdk_bdev_ftl.a 00:03:08.575 LIB libspdk_bdev_passthru.a 00:03:08.575 LIB libspdk_bdev_aio.a 00:03:08.575 LIB libspdk_bdev_zone_block.a 00:03:08.575 LIB libspdk_bdev_iscsi.a 00:03:08.575 LIB libspdk_bdev_delay.a 00:03:08.575 LIB libspdk_bdev_malloc.a 00:03:08.575 LIB libspdk_bdev_virtio.a 00:03:08.833 LIB libspdk_bdev_raid.a 00:03:08.833 LIB libspdk_bdev_lvol.a 00:03:09.770 LIB libspdk_bdev_nvme.a 00:03:10.707 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:10.707 CC module/event/subsystems/scheduler/scheduler.o 00:03:10.707 CC module/event/subsystems/iobuf/iobuf.o 00:03:10.707 CC module/event/subsystems/sock/sock.o 00:03:10.707 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:10.707 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:10.707 CC module/event/subsystems/vmd/vmd.o 00:03:10.707 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:10.707 LIB libspdk_event_sock.a 00:03:10.707 LIB libspdk_event_scheduler.a 00:03:10.707 LIB libspdk_event_iobuf.a 00:03:10.707 LIB libspdk_event_vhost_blk.a 00:03:10.707 LIB libspdk_event_vfu_tgt.a 00:03:10.707 LIB libspdk_event_vmd.a 00:03:10.966 CC module/event/subsystems/accel/accel.o 00:03:10.966 LIB libspdk_event_accel.a 00:03:11.558 CC module/event/subsystems/bdev/bdev.o 00:03:11.558 LIB libspdk_event_bdev.a 00:03:11.816 CC module/event/subsystems/nbd/nbd.o 00:03:11.816 CC module/event/subsystems/scsi/scsi.o 00:03:11.816 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:11.816 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:11.816 CC module/event/subsystems/ublk/ublk.o 00:03:12.075 LIB libspdk_event_nbd.a 00:03:12.075 LIB libspdk_event_scsi.a 00:03:12.075 LIB libspdk_event_ublk.a 00:03:12.075 LIB libspdk_event_nvmf.a 00:03:12.338 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:12.338 CC module/event/subsystems/iscsi/iscsi.o 00:03:12.597 LIB libspdk_event_vhost_scsi.a 00:03:12.597 LIB libspdk_event_iscsi.a 00:03:12.856 CC test/rpc_client/rpc_client_test.o 00:03:12.856 TEST_HEADER include/spdk/accel_module.h 00:03:12.856 TEST_HEADER include/spdk/accel.h 00:03:12.856 TEST_HEADER include/spdk/barrier.h 00:03:12.856 TEST_HEADER include/spdk/base64.h 00:03:12.856 TEST_HEADER include/spdk/bdev.h 00:03:12.856 TEST_HEADER include/spdk/assert.h 00:03:12.856 TEST_HEADER include/spdk/bdev_zone.h 00:03:12.856 TEST_HEADER include/spdk/bit_array.h 00:03:12.856 TEST_HEADER include/spdk/bdev_module.h 00:03:12.856 TEST_HEADER include/spdk/blob_bdev.h 00:03:12.856 TEST_HEADER include/spdk/bit_pool.h 00:03:12.856 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:12.856 TEST_HEADER include/spdk/blob.h 
00:03:12.856 TEST_HEADER include/spdk/blobfs.h 00:03:12.856 TEST_HEADER include/spdk/conf.h 00:03:12.856 TEST_HEADER include/spdk/config.h 00:03:12.856 TEST_HEADER include/spdk/cpuset.h 00:03:12.856 TEST_HEADER include/spdk/crc32.h 00:03:12.856 TEST_HEADER include/spdk/crc16.h 00:03:12.856 TEST_HEADER include/spdk/crc64.h 00:03:12.856 TEST_HEADER include/spdk/dif.h 00:03:12.856 TEST_HEADER include/spdk/dma.h 00:03:12.856 CXX app/trace/trace.o 00:03:12.856 TEST_HEADER include/spdk/endian.h 00:03:12.856 TEST_HEADER include/spdk/env.h 00:03:12.856 TEST_HEADER include/spdk/env_dpdk.h 00:03:12.856 TEST_HEADER include/spdk/event.h 00:03:12.856 TEST_HEADER include/spdk/fd_group.h 00:03:12.856 TEST_HEADER include/spdk/fd.h 00:03:12.856 TEST_HEADER include/spdk/file.h 00:03:12.856 TEST_HEADER include/spdk/ftl.h 00:03:12.856 TEST_HEADER include/spdk/gpt_spec.h 00:03:12.856 TEST_HEADER include/spdk/hexlify.h 00:03:12.856 TEST_HEADER include/spdk/histogram_data.h 00:03:12.856 CC app/trace_record/trace_record.o 00:03:12.856 CC app/spdk_top/spdk_top.o 00:03:12.856 TEST_HEADER include/spdk/idxd.h 00:03:12.856 TEST_HEADER include/spdk/idxd_spec.h 00:03:12.856 TEST_HEADER include/spdk/ioat.h 00:03:12.856 TEST_HEADER include/spdk/init.h 00:03:12.856 TEST_HEADER include/spdk/iscsi_spec.h 00:03:12.856 CC app/spdk_nvme_identify/identify.o 00:03:12.856 TEST_HEADER include/spdk/json.h 00:03:12.856 TEST_HEADER include/spdk/ioat_spec.h 00:03:12.856 TEST_HEADER include/spdk/jsonrpc.h 00:03:12.856 CC app/spdk_nvme_perf/perf.o 00:03:12.856 TEST_HEADER include/spdk/log.h 00:03:12.856 TEST_HEADER include/spdk/likely.h 00:03:12.856 TEST_HEADER include/spdk/lvol.h 00:03:12.856 TEST_HEADER include/spdk/memory.h 00:03:12.856 TEST_HEADER include/spdk/mmio.h 00:03:12.856 TEST_HEADER include/spdk/nbd.h 00:03:12.856 TEST_HEADER include/spdk/notify.h 00:03:12.856 TEST_HEADER include/spdk/nvme.h 00:03:12.856 TEST_HEADER include/spdk/nvme_intel.h 00:03:12.856 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:12.856 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:12.856 TEST_HEADER include/spdk/nvme_spec.h 00:03:12.856 TEST_HEADER include/spdk/nvme_zns.h 00:03:12.856 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:12.856 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:12.856 TEST_HEADER include/spdk/nvmf.h 00:03:12.856 CC app/spdk_lspci/spdk_lspci.o 00:03:12.856 TEST_HEADER include/spdk/nvmf_spec.h 00:03:12.856 TEST_HEADER include/spdk/nvmf_transport.h 00:03:12.856 TEST_HEADER include/spdk/opal.h 00:03:12.856 TEST_HEADER include/spdk/pci_ids.h 00:03:12.856 TEST_HEADER include/spdk/opal_spec.h 00:03:12.856 CC app/spdk_nvme_discover/discovery_aer.o 00:03:12.856 TEST_HEADER include/spdk/pipe.h 00:03:12.856 TEST_HEADER include/spdk/queue.h 00:03:12.856 TEST_HEADER include/spdk/reduce.h 00:03:12.856 TEST_HEADER include/spdk/rpc.h 00:03:12.856 TEST_HEADER include/spdk/scheduler.h 00:03:12.856 TEST_HEADER include/spdk/scsi.h 00:03:12.856 TEST_HEADER include/spdk/scsi_spec.h 00:03:12.856 TEST_HEADER include/spdk/sock.h 00:03:12.856 TEST_HEADER include/spdk/stdinc.h 00:03:12.856 TEST_HEADER include/spdk/string.h 00:03:12.856 TEST_HEADER include/spdk/thread.h 00:03:12.856 TEST_HEADER include/spdk/trace.h 00:03:12.856 TEST_HEADER include/spdk/trace_parser.h 00:03:12.856 TEST_HEADER include/spdk/ublk.h 00:03:12.856 TEST_HEADER include/spdk/tree.h 00:03:12.856 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:12.856 TEST_HEADER include/spdk/util.h 00:03:12.856 TEST_HEADER include/spdk/uuid.h 00:03:12.856 TEST_HEADER include/spdk/version.h 00:03:12.856 
TEST_HEADER include/spdk/vfio_user_pci.h 00:03:12.856 TEST_HEADER include/spdk/vhost.h 00:03:12.856 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:12.856 CC app/vhost/vhost.o 00:03:12.856 TEST_HEADER include/spdk/vmd.h 00:03:12.856 TEST_HEADER include/spdk/xor.h 00:03:12.856 TEST_HEADER include/spdk/zipf.h 00:03:12.856 CXX test/cpp_headers/accel.o 00:03:12.856 CXX test/cpp_headers/accel_module.o 00:03:12.856 CC app/nvmf_tgt/nvmf_main.o 00:03:12.856 CXX test/cpp_headers/assert.o 00:03:12.856 CXX test/cpp_headers/barrier.o 00:03:12.856 CXX test/cpp_headers/bdev.o 00:03:12.856 CXX test/cpp_headers/base64.o 00:03:12.856 CXX test/cpp_headers/bdev_module.o 00:03:12.856 CC app/spdk_dd/spdk_dd.o 00:03:12.856 CXX test/cpp_headers/bdev_zone.o 00:03:12.856 CXX test/cpp_headers/bit_pool.o 00:03:12.856 CXX test/cpp_headers/bit_array.o 00:03:12.856 CXX test/cpp_headers/blob_bdev.o 00:03:12.856 CXX test/cpp_headers/blobfs.o 00:03:12.856 CXX test/cpp_headers/blobfs_bdev.o 00:03:12.856 CXX test/cpp_headers/blob.o 00:03:12.856 CXX test/cpp_headers/conf.o 00:03:12.856 CXX test/cpp_headers/config.o 00:03:12.856 CXX test/cpp_headers/cpuset.o 00:03:12.856 CXX test/cpp_headers/crc16.o 00:03:13.123 CXX test/cpp_headers/crc64.o 00:03:13.123 CXX test/cpp_headers/crc32.o 00:03:13.123 CXX test/cpp_headers/dif.o 00:03:13.123 CXX test/cpp_headers/dma.o 00:03:13.123 CXX test/cpp_headers/endian.o 00:03:13.123 CXX test/cpp_headers/env_dpdk.o 00:03:13.123 CXX test/cpp_headers/env.o 00:03:13.123 CXX test/cpp_headers/event.o 00:03:13.123 CXX test/cpp_headers/fd_group.o 00:03:13.123 CXX test/cpp_headers/fd.o 00:03:13.123 CXX test/cpp_headers/file.o 00:03:13.123 CXX test/cpp_headers/gpt_spec.o 00:03:13.123 CXX test/cpp_headers/ftl.o 00:03:13.123 CXX test/cpp_headers/hexlify.o 00:03:13.123 CC app/iscsi_tgt/iscsi_tgt.o 00:03:13.123 CXX test/cpp_headers/histogram_data.o 00:03:13.123 CXX test/cpp_headers/idxd.o 00:03:13.123 CXX test/cpp_headers/idxd_spec.o 00:03:13.123 CXX test/cpp_headers/init.o 00:03:13.123 CC test/event/reactor/reactor.o 00:03:13.123 CC app/spdk_tgt/spdk_tgt.o 00:03:13.123 CC test/thread/lock/spdk_lock.o 00:03:13.123 CC test/app/histogram_perf/histogram_perf.o 00:03:13.123 CC test/nvme/reset/reset.o 00:03:13.123 CC test/event/event_perf/event_perf.o 00:03:13.123 CC test/nvme/err_injection/err_injection.o 00:03:13.123 CC test/nvme/connect_stress/connect_stress.o 00:03:13.123 CC test/nvme/aer/aer.o 00:03:13.123 CC test/nvme/overhead/overhead.o 00:03:13.123 CC test/thread/poller_perf/poller_perf.o 00:03:13.123 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:13.123 CC test/app/jsoncat/jsoncat.o 00:03:13.123 CC test/nvme/simple_copy/simple_copy.o 00:03:13.123 CC test/nvme/cuse/cuse.o 00:03:13.123 CC test/nvme/startup/startup.o 00:03:13.123 CC test/nvme/reserve/reserve.o 00:03:13.123 CC test/nvme/e2edp/nvme_dp.o 00:03:13.123 CC test/nvme/compliance/nvme_compliance.o 00:03:13.123 CC test/nvme/sgl/sgl.o 00:03:13.123 CC test/nvme/fused_ordering/fused_ordering.o 00:03:13.123 CC test/nvme/boot_partition/boot_partition.o 00:03:13.123 CC test/env/memory/memory_ut.o 00:03:13.123 CC test/app/stub/stub.o 00:03:13.123 CC test/event/reactor_perf/reactor_perf.o 00:03:13.123 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:13.123 CC examples/nvme/reconnect/reconnect.o 00:03:13.123 CC examples/ioat/perf/perf.o 00:03:13.123 CC test/env/pci/pci_ut.o 00:03:13.123 CC examples/ioat/verify/verify.o 00:03:13.123 CC test/env/vtophys/vtophys.o 00:03:13.123 CC examples/vmd/lsvmd/lsvmd.o 00:03:13.123 CC examples/util/zipf/zipf.o 
00:03:13.123 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:13.123 CC examples/nvme/abort/abort.o 00:03:13.123 CC examples/nvme/hello_world/hello_world.o 00:03:13.123 CC examples/nvme/arbitration/arbitration.o 00:03:13.123 CC examples/sock/hello_world/hello_sock.o 00:03:13.123 CC test/accel/dif/dif.o 00:03:13.123 CC test/dma/test_dma/test_dma.o 00:03:13.123 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:13.123 CC test/nvme/fdp/fdp.o 00:03:13.123 CC app/fio/nvme/fio_plugin.o 00:03:13.123 CC test/bdev/bdevio/bdevio.o 00:03:13.123 CC examples/accel/perf/accel_perf.o 00:03:13.123 CC examples/idxd/perf/perf.o 00:03:13.123 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:13.123 CC examples/vmd/led/led.o 00:03:13.123 CC test/event/app_repeat/app_repeat.o 00:03:13.123 CC test/event/scheduler/scheduler.o 00:03:13.123 CC app/fio/bdev/fio_plugin.o 00:03:13.123 CC test/app/bdev_svc/bdev_svc.o 00:03:13.123 CC test/blobfs/mkfs/mkfs.o 00:03:13.123 CC examples/blob/cli/blobcli.o 00:03:13.123 LINK rpc_client_test 00:03:13.123 CC examples/nvme/hotplug/hotplug.o 00:03:13.123 CC examples/nvmf/nvmf/nvmf.o 00:03:13.123 CC examples/thread/thread/thread_ex.o 00:03:13.123 CC examples/bdev/hello_world/hello_bdev.o 00:03:13.123 LINK spdk_lspci 00:03:13.123 CC examples/blob/hello_world/hello_blob.o 00:03:13.123 CC examples/bdev/bdevperf/bdevperf.o 00:03:13.123 CC test/lvol/esnap/esnap.o 00:03:13.123 CC test/env/mem_callbacks/mem_callbacks.o 00:03:13.123 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:13.123 CXX test/cpp_headers/ioat.o 00:03:13.123 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:13.123 CXX test/cpp_headers/ioat_spec.o 00:03:13.391 LINK reactor 00:03:13.391 LINK spdk_nvme_discover 00:03:13.391 CXX test/cpp_headers/json.o 00:03:13.391 LINK histogram_perf 00:03:13.391 CXX test/cpp_headers/iscsi_spec.o 00:03:13.391 LINK nvmf_tgt 00:03:13.391 LINK interrupt_tgt 00:03:13.391 CXX test/cpp_headers/jsonrpc.o 00:03:13.391 CXX test/cpp_headers/log.o 00:03:13.391 CXX test/cpp_headers/likely.o 00:03:13.391 CXX test/cpp_headers/lvol.o 00:03:13.391 CXX test/cpp_headers/memory.o 00:03:13.391 CXX test/cpp_headers/mmio.o 00:03:13.391 CXX test/cpp_headers/nbd.o 00:03:13.391 CXX test/cpp_headers/nvme.o 00:03:13.391 CXX test/cpp_headers/nvme_intel.o 00:03:13.391 CXX test/cpp_headers/notify.o 00:03:13.391 CXX test/cpp_headers/nvme_ocssd.o 00:03:13.391 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:13.391 CXX test/cpp_headers/nvme_spec.o 00:03:13.391 CXX test/cpp_headers/nvmf_cmd.o 00:03:13.391 CXX test/cpp_headers/nvme_zns.o 00:03:13.391 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:13.391 CXX test/cpp_headers/nvmf.o 00:03:13.391 CXX test/cpp_headers/nvmf_spec.o 00:03:13.391 LINK spdk_trace_record 00:03:13.391 CXX test/cpp_headers/nvmf_transport.o 00:03:13.391 CXX test/cpp_headers/opal.o 00:03:13.391 CXX test/cpp_headers/opal_spec.o 00:03:13.391 LINK vhost 00:03:13.391 LINK jsoncat 00:03:13.391 CXX test/cpp_headers/pci_ids.o 00:03:13.391 LINK connect_stress 00:03:13.391 LINK event_perf 00:03:13.391 CXX test/cpp_headers/pipe.o 00:03:13.391 LINK reactor_perf 00:03:13.391 LINK vtophys 00:03:13.391 LINK poller_perf 00:03:13.391 LINK lsvmd 00:03:13.391 CXX test/cpp_headers/queue.o 00:03:13.391 CXX test/cpp_headers/reduce.o 00:03:13.391 CXX test/cpp_headers/rpc.o 00:03:13.391 LINK zipf 00:03:13.391 CXX test/cpp_headers/scheduler.o 00:03:13.391 CXX test/cpp_headers/scsi.o 00:03:13.391 CXX test/cpp_headers/scsi_spec.o 00:03:13.391 LINK env_dpdk_post_init 00:03:13.391 LINK led 00:03:13.391 LINK app_repeat 00:03:13.391 LINK startup 
00:03:13.391 LINK iscsi_tgt 00:03:13.391 LINK stub 00:03:13.391 LINK err_injection 00:03:13.391 LINK boot_partition 00:03:13.391 LINK doorbell_aers 00:03:13.391 LINK reserve 00:03:13.391 LINK pmr_persistence 00:03:13.391 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:13.391 LINK fused_ordering 00:03:13.391 LINK spdk_tgt 00:03:13.391 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:13.391 LINK bdev_svc 00:03:13.391 LINK hello_world 00:03:13.391 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:13.391 LINK cmb_copy 00:03:13.391 LINK ioat_perf 00:03:13.391 LINK verify 00:03:13.391 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:13.391 LINK mkfs 00:03:13.391 CXX test/cpp_headers/sock.o 00:03:13.391 LINK simple_copy 00:03:13.391 LINK nvme_dp 00:03:13.391 LINK sgl 00:03:13.391 LINK hello_sock 00:03:13.391 LINK scheduler 00:03:13.391 LINK reset 00:03:13.391 LINK hello_blob 00:03:13.391 CXX test/cpp_headers/stdinc.o 00:03:13.391 LINK hotplug 00:03:13.653 LINK fdp 00:03:13.653 CXX test/cpp_headers/string.o 00:03:13.653 LINK spdk_trace 00:03:13.653 LINK overhead 00:03:13.653 LINK aer 00:03:13.653 CXX test/cpp_headers/thread.o 00:03:13.653 CXX test/cpp_headers/trace.o 00:03:13.653 CXX test/cpp_headers/trace_parser.o 00:03:13.653 CXX test/cpp_headers/tree.o 00:03:13.653 CXX test/cpp_headers/ublk.o 00:03:13.653 CXX test/cpp_headers/util.o 00:03:13.653 LINK thread 00:03:13.653 CXX test/cpp_headers/uuid.o 00:03:13.654 LINK nvmf 00:03:13.654 CXX test/cpp_headers/version.o 00:03:13.654 CXX test/cpp_headers/vfio_user_pci.o 00:03:13.654 CXX test/cpp_headers/vfio_user_spec.o 00:03:13.654 LINK hello_bdev 00:03:13.654 CXX test/cpp_headers/vhost.o 00:03:13.654 CXX test/cpp_headers/vmd.o 00:03:13.654 CXX test/cpp_headers/xor.o 00:03:13.654 CXX test/cpp_headers/zipf.o 00:03:13.654 LINK reconnect 00:03:13.654 LINK idxd_perf 00:03:13.654 LINK bdevio 00:03:13.654 LINK dif 00:03:13.654 LINK spdk_dd 00:03:13.654 LINK test_dma 00:03:13.654 LINK abort 00:03:13.654 LINK arbitration 00:03:13.654 LINK nvme_manage 00:03:13.927 LINK pci_ut 00:03:13.928 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:03:13.928 struct spdk_nvme_fdp_ruhs ruhs; 00:03:13.928 ^ 00:03:13.928 LINK nvme_compliance 00:03:13.928 LINK accel_perf 00:03:13.928 LINK nvme_fuzz 00:03:13.928 LINK blobcli 00:03:13.928 LINK llvm_vfio_fuzz 00:03:13.928 LINK spdk_nvme_perf 00:03:13.928 LINK spdk_nvme_identify 00:03:13.928 LINK vhost_fuzz 00:03:13.928 LINK spdk_bdev 00:03:14.448 LINK mem_callbacks 00:03:14.448 LINK spdk_top 00:03:14.448 LINK bdevperf 00:03:14.448 LINK llvm_nvme_fuzz 00:03:14.448 LINK memory_ut 00:03:14.448 LINK cuse 00:03:14.706 1 warning generated. 
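The single warning generated above is clang's -Wgnu-variable-sized-type-not-at-end check firing in fio_plugin.c: a struct whose last member is a flexible array (here struct spdk_nvme_fdp_ruhs) is itself variable-sized, and ISO C (C11 6.7.2.1) does not allow such a struct to be a member of another struct at all. GCC and clang accept the embedding as a GNU extension, and clang warns specifically when the embedded field is not the last member. A minimal sketch of the pattern, using hypothetical type names rather than SPDK's actual definitions:

    #include <stdint.h>

    struct ruhs_like {           /* hypothetical stand-in for a FAM-terminated struct */
        uint16_t nruhsd;
        uint16_t desc[];         /* flexible array member: must be the last member */
    };

    struct warns {
        struct ruhs_like ruhs;   /* variable-sized field NOT at the end, so clang
                                  * emits -Wgnu-variable-sized-type-not-at-end */
        int fd;
    };

    struct quiet {
        int fd;
        struct ruhs_like ruhs;   /* placed last: this particular warning no longer
                                  * applies (the embedding is still a GNU extension) */
    };

Since the build treats this as a warning rather than an error, the log continues past it with "1 warning generated." and the link steps complete normally.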
00:03:14.706 LINK spdk_nvme 00:03:14.964 LINK spdk_lock 00:03:15.222 LINK iscsi_fuzz 00:03:17.764 LINK esnap 00:03:18.024 00:03:18.024 real 0m48.561s 00:03:18.024 user 7m10.597s 00:03:18.024 sys 2m56.660s 00:03:18.024 05:02:48 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:18.024 05:02:48 -- common/autotest_common.sh@10 -- $ set +x 00:03:18.024 ************************************ 00:03:18.024 END TEST make 00:03:18.024 ************************************ 00:03:18.024 05:02:49 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:18.024 05:02:49 -- nvmf/common.sh@7 -- # uname -s 00:03:18.024 05:02:49 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:18.024 05:02:49 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:18.024 05:02:49 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:18.024 05:02:49 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:18.024 05:02:49 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:18.024 05:02:49 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:18.024 05:02:49 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:18.024 05:02:49 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:18.024 05:02:49 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:18.024 05:02:49 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:18.024 05:02:49 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:18.024 05:02:49 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:18.024 05:02:49 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:18.024 05:02:49 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:18.024 05:02:49 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:18.024 05:02:49 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:18.024 05:02:49 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:18.024 05:02:49 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:18.024 05:02:49 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:18.024 05:02:49 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:18.024 05:02:49 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:18.024 05:02:49 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:18.024 05:02:49 -- paths/export.sh@5 -- # export PATH 00:03:18.024 05:02:49 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:18.024 05:02:49 -- nvmf/common.sh@46 -- # : 0 00:03:18.024 05:02:49 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:18.024 05:02:49 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:18.024 05:02:49 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:18.024 05:02:49 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:18.024 05:02:49 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:18.024 05:02:49 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:18.024 05:02:49 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:18.024 05:02:49 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:18.024 05:02:49 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:18.024 05:02:49 -- spdk/autotest.sh@32 -- # uname -s 00:03:18.024 05:02:49 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:18.025 05:02:49 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:18.025 05:02:49 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:18.025 05:02:49 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:18.025 05:02:49 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:18.025 05:02:49 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:18.025 05:02:49 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:18.025 05:02:49 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:18.025 05:02:49 -- spdk/autotest.sh@48 -- # udevadm_pid=3066813 00:03:18.025 05:02:49 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:18.025 05:02:49 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:18.025 05:02:49 -- spdk/autotest.sh@54 -- # echo 3066815 00:03:18.025 05:02:49 -- spdk/autotest.sh@56 -- # echo 3066816 00:03:18.025 05:02:49 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:03:18.025 05:02:49 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:18.025 05:02:49 -- spdk/autotest.sh@60 -- # echo 3066817 00:03:18.025 05:02:49 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:18.025 05:02:49 -- spdk/autotest.sh@62 -- # echo 3066818 00:03:18.025 05:02:49 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:18.025 05:02:49 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:18.025 05:02:49 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:18.025 05:02:49 -- common/autotest_common.sh@10 -- # set +x 00:03:18.285 05:02:49 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:18.285 05:02:49 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:18.285 05:02:49 -- spdk/autotest.sh@70 -- # create_test_list 00:03:18.285 05:02:49 -- common/autotest_common.sh@736 -- # xtrace_disable 00:03:18.285 05:02:49 -- common/autotest_common.sh@10 -- # set +x 00:03:18.285 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:18.285 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:18.285 05:02:49 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:18.285 05:02:49 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:18.285 05:02:49 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:18.285 05:02:49 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:18.285 05:02:49 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:18.285 05:02:49 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:18.285 05:02:49 -- common/autotest_common.sh@1440 -- # uname 00:03:18.285 05:02:49 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:03:18.285 05:02:49 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:03:18.285 05:02:49 -- common/autotest_common.sh@1460 -- # uname 00:03:18.285 05:02:49 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:03:18.285 05:02:49 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:03:18.285 05:02:49 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:03:18.285 05:02:49 -- spdk/autotest.sh@83 -- # hash lcov 00:03:18.285 05:02:49 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:03:18.285 05:02:49 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:03:18.285 05:02:49 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:18.285 05:02:49 -- common/autotest_common.sh@10 -- # set +x 00:03:18.285 05:02:49 -- spdk/autotest.sh@102 -- # rm -f 00:03:18.285 05:02:49 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:22.480 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:00:04.6 (8086 2021): 
Already using the ioatdma driver 00:03:22.480 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:22.480 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:22.480 05:02:53 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:03:22.480 05:02:53 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:22.480 05:02:53 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:22.480 05:02:53 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:22.480 05:02:53 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:22.480 05:02:53 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:22.480 05:02:53 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:22.480 05:02:53 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:22.480 05:02:53 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:22.480 05:02:53 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:03:22.480 05:02:53 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:03:22.480 05:02:53 -- spdk/autotest.sh@121 -- # grep -v p 00:03:22.480 05:02:53 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:22.480 05:02:53 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:03:22.480 05:02:53 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:03:22.480 05:02:53 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:22.480 05:02:53 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:22.480 No valid GPT data, bailing 00:03:22.480 05:02:53 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:22.480 05:02:53 -- scripts/common.sh@393 -- # pt= 00:03:22.480 05:02:53 -- scripts/common.sh@394 -- # return 1 00:03:22.480 05:02:53 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:22.480 1+0 records in 00:03:22.480 1+0 records out 00:03:22.480 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00587182 s, 179 MB/s 00:03:22.480 05:02:53 -- spdk/autotest.sh@129 -- # sync 00:03:22.480 05:02:53 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:22.480 05:02:53 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:22.480 05:02:53 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:29.053 05:02:59 -- spdk/autotest.sh@135 -- # uname -s 00:03:29.053 05:02:59 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:29.054 05:02:59 -- spdk/autotest.sh@136 -- # run_test setup.sh 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:29.054 05:02:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:29.054 05:02:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:29.054 05:02:59 -- common/autotest_common.sh@10 -- # set +x 00:03:29.054 ************************************ 00:03:29.054 START TEST setup.sh 00:03:29.054 ************************************ 00:03:29.054 05:02:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:29.054 * Looking for test storage... 00:03:29.054 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:29.054 05:02:59 -- setup/test-setup.sh@10 -- # uname -s 00:03:29.054 05:02:59 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:29.054 05:02:59 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:29.054 05:02:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:29.054 05:02:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:29.054 05:02:59 -- common/autotest_common.sh@10 -- # set +x 00:03:29.054 ************************************ 00:03:29.054 START TEST acl 00:03:29.054 ************************************ 00:03:29.054 05:02:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:29.054 * Looking for test storage... 00:03:29.054 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:29.054 05:02:59 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:29.054 05:02:59 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:29.054 05:02:59 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:29.054 05:02:59 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:29.054 05:02:59 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:29.054 05:02:59 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:29.054 05:02:59 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:29.054 05:02:59 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:29.054 05:02:59 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:29.054 05:02:59 -- setup/acl.sh@12 -- # devs=() 00:03:29.054 05:02:59 -- setup/acl.sh@12 -- # declare -a devs 00:03:29.054 05:02:59 -- setup/acl.sh@13 -- # drivers=() 00:03:29.054 05:02:59 -- setup/acl.sh@13 -- # declare -A drivers 00:03:29.054 05:02:59 -- setup/acl.sh@51 -- # setup reset 00:03:29.054 05:02:59 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:29.054 05:02:59 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:33.280 05:03:03 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:33.280 05:03:03 -- setup/acl.sh@16 -- # local dev driver 00:03:33.280 05:03:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.280 05:03:03 -- setup/acl.sh@15 -- # setup output status 00:03:33.280 05:03:03 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:33.280 05:03:03 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:35.812 Hugepages 00:03:35.812 node hugesize free / total 00:03:35.812 05:03:06 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:35.812 05:03:06 -- setup/acl.sh@19 -- # continue 00:03:35.812 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
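The acl run that follows is, at bottom, a field-parsing loop: "setup.sh status" prints one row per device (the "Type BDF Vendor Device NUMA Driver Device Block devices" table below), and the collector splits each row with "read -r _ dev _ _ _ driver _", drops any row whose second field is not a PCI BDF (which is how the hugepage rows fall out), and records only NVMe controllers. A minimal self-contained sketch of that pattern, with two hypothetical rows standing in for real status output:

    #!/usr/bin/env bash
    # Sketch of the devs/drivers collection loop visible in the acl.sh xtrace.
    # The two sample rows are illustrative, not taken from this run's output.
    declare -a devs
    declare -A drivers
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue    # 2nd field must look like a BDF
        [[ $driver == nvme ]] || continue    # the test only tracks NVMe controllers
        devs+=("$dev")
        drivers["$dev"]=$driver
    done < <(printf '%s\n' \
        'I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -' \
        'NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1')
    printf '%s -> %s\n' "${devs[0]}" "${drivers[${devs[0]}]}"  # 0000:d8:00.0 -> nvme

Parsing positional fields this way keeps the collector independent of column widths, which is why the header and hugepage rows are rejected by the BDF test rather than by any special-casing.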
00:03:35.812 05:03:06 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:35.812 05:03:06 -- setup/acl.sh@19 -- # continue 00:03:35.812 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:35.812 05:03:06 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:35.812 05:03:06 -- setup/acl.sh@19 -- # continue 00:03:35.812 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:35.812 00:03:35.812 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:35.812 05:03:06 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:35.812 05:03:06 -- setup/acl.sh@19 -- # continue 00:03:35.812 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:35.812 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:35.812 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:35.812 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:35.812 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ 
ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.071 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.071 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.071 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:36.072 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.072 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.072 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.072 05:03:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:36.072 05:03:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:36.072 05:03:06 -- setup/acl.sh@20 -- # continue 00:03:36.072 05:03:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.072 05:03:07 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:36.072 05:03:07 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:36.072 05:03:07 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:36.072 05:03:07 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:36.072 05:03:07 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:36.072 05:03:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:36.072 05:03:07 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:36.072 05:03:07 -- setup/acl.sh@54 -- # run_test denied denied 00:03:36.072 05:03:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:36.072 05:03:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:36.072 05:03:07 -- common/autotest_common.sh@10 -- # set +x 00:03:36.072 ************************************ 00:03:36.072 START TEST denied 00:03:36.072 ************************************ 00:03:36.072 05:03:07 -- common/autotest_common.sh@1104 -- # denied 00:03:36.072 05:03:07 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:36.072 05:03:07 -- setup/acl.sh@38 -- # setup output config 00:03:36.072 05:03:07 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:36.072 05:03:07 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:36.072 05:03:07 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:40.265 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:40.265 05:03:10 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:40.265 05:03:10 -- setup/acl.sh@28 -- # local dev driver 00:03:40.265 05:03:10 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:40.265 05:03:10 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:40.265 05:03:10 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:40.265 05:03:10 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:40.265 05:03:10 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:40.265 
05:03:10 -- setup/acl.sh@41 -- # setup reset 00:03:40.265 05:03:10 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:40.265 05:03:10 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:44.462 00:03:44.462 real 0m8.437s 00:03:44.462 user 0m2.637s 00:03:44.462 sys 0m5.112s 00:03:44.462 05:03:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:44.462 05:03:15 -- common/autotest_common.sh@10 -- # set +x 00:03:44.462 ************************************ 00:03:44.462 END TEST denied 00:03:44.462 ************************************ 00:03:44.722 05:03:15 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:44.722 05:03:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:44.722 05:03:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:44.722 05:03:15 -- common/autotest_common.sh@10 -- # set +x 00:03:44.722 ************************************ 00:03:44.722 START TEST allowed 00:03:44.722 ************************************ 00:03:44.722 05:03:15 -- common/autotest_common.sh@1104 -- # allowed 00:03:44.722 05:03:15 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:44.722 05:03:15 -- setup/acl.sh@45 -- # setup output config 00:03:44.722 05:03:15 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:44.722 05:03:15 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:44.722 05:03:15 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:49.999 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:49.999 05:03:20 -- setup/acl.sh@47 -- # verify 00:03:49.999 05:03:20 -- setup/acl.sh@28 -- # local dev driver 00:03:49.999 05:03:20 -- setup/acl.sh@48 -- # setup reset 00:03:49.999 05:03:20 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:49.999 05:03:20 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:54.197 00:03:54.197 real 0m9.044s 00:03:54.197 user 0m2.566s 00:03:54.197 sys 0m5.063s 00:03:54.197 05:03:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.197 05:03:24 -- common/autotest_common.sh@10 -- # set +x 00:03:54.197 ************************************ 00:03:54.197 END TEST allowed 00:03:54.197 ************************************ 00:03:54.197 00:03:54.197 real 0m25.090s 00:03:54.197 user 0m7.899s 00:03:54.197 sys 0m15.398s 00:03:54.197 05:03:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.197 05:03:24 -- common/autotest_common.sh@10 -- # set +x 00:03:54.197 ************************************ 00:03:54.197 END TEST acl 00:03:54.197 ************************************ 00:03:54.197 05:03:24 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:54.197 05:03:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:54.197 05:03:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:54.197 05:03:24 -- common/autotest_common.sh@10 -- # set +x 00:03:54.197 ************************************ 00:03:54.197 START TEST hugepages 00:03:54.197 ************************************ 00:03:54.197 05:03:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:54.197 * Looking for test storage... 
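The denied/allowed pair above drives scripts/setup.sh through two environment knobs: PCI_BLOCKED names controllers setup must skip (hence "Skipping denied controller at 0000:d8:00.0"), while a non-empty PCI_ALLOWED restricts setup to the listed BDFs (hence only 0000:d8:00.0 being rebound nvme -> vfio-pci in the allowed run). A hedged sketch of that filtering idea, written for illustration rather than copied from setup.sh:

    #!/usr/bin/env bash
    # Simplified allow/block filter; pci_can_use here is illustrative and
    # not the literal helper from scripts/setup.sh.
    pci_can_use() {    # usage: pci_can_use 0000:d8:00.0
        local bdf=$1 b a
        for b in $PCI_BLOCKED; do
            [[ $bdf == "$b" ]] && return 1   # an explicit block always wins
        done
        [[ -z $PCI_ALLOWED ]] && return 0    # empty allow list: everything passes
        for a in $PCI_ALLOWED; do
            [[ $bdf == "$a" ]] && return 0
        done
        return 1                             # allow list set and BDF not on it
    }
    PCI_BLOCKED=' 0000:d8:00.0'
    pci_can_use 0000:d8:00.0 || echo 'Skipping denied controller at 0000:d8:00.0'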
00:03:54.197 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:54.197 05:03:24 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:54.197 05:03:24 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:54.197 05:03:24 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:54.197 05:03:24 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:54.197 05:03:24 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:54.197 05:03:24 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:54.197 05:03:24 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:54.197 05:03:24 -- setup/common.sh@18 -- # local node= 00:03:54.198 05:03:24 -- setup/common.sh@19 -- # local var val 00:03:54.198 05:03:24 -- setup/common.sh@20 -- # local mem_f mem 00:03:54.198 05:03:24 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.198 05:03:24 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.198 05:03:24 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.198 05:03:24 -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.198 05:03:24 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.198 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.198 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.198 05:03:24 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 38309616 kB' 'MemAvailable: 38940628 kB' 'Buffers: 4304 kB' 'Cached: 12146292 kB' 'SwapCached: 1968 kB' 'Active: 8001844 kB' 'Inactive: 4694792 kB' 'Active(anon): 7490256 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547516 kB' 'Mapped: 198924 kB' 'Shmem: 11222956 kB' 'KReclaimable: 572252 kB' 'Slab: 1241008 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 668756 kB' 'KernelStack: 22176 kB' 'PageTables: 8932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439068 kB' 'Committed_AS: 14276828 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216532 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB' 00:03:54.198 05:03:24 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.198 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.198 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.198 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.198 05:03:24 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.198 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.198 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.198 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.198 05:03:24 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.198 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.198 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.198 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.198 05:03:24 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.198 05:03:24 -- setup/common.sh@32 -- # continue [xtrace condensed: the same '[[ $var == Hugepagesize ]] / continue' check repeats verbatim for each remaining /proc/meminfo key, Cached through ShmemHugePages] 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': '
00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # continue 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 05:03:24 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 05:03:24 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.199 05:03:24 -- setup/common.sh@33 -- # echo 2048 00:03:54.199 05:03:24 -- setup/common.sh@33 -- # return 0 00:03:54.199 05:03:24 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:54.199 05:03:24 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:54.199 05:03:24 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:54.199 05:03:24 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:54.199 05:03:24 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:54.199 05:03:24 -- 
setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:54.199 05:03:24 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:54.199 05:03:24 -- setup/hugepages.sh@207 -- # get_nodes 00:03:54.199 05:03:24 -- setup/hugepages.sh@27 -- # local node 00:03:54.199 05:03:24 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.199 05:03:24 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:54.199 05:03:24 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.199 05:03:24 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:54.199 05:03:24 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:54.199 05:03:24 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:54.199 05:03:24 -- setup/hugepages.sh@208 -- # clear_hp 00:03:54.199 05:03:24 -- setup/hugepages.sh@37 -- # local node hp 00:03:54.199 05:03:24 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:54.199 05:03:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.199 05:03:24 -- setup/hugepages.sh@41 -- # echo 0 00:03:54.199 05:03:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.199 05:03:24 -- setup/hugepages.sh@41 -- # echo 0 00:03:54.199 05:03:24 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:54.199 05:03:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.199 05:03:24 -- setup/hugepages.sh@41 -- # echo 0 00:03:54.199 05:03:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.199 05:03:24 -- setup/hugepages.sh@41 -- # echo 0 00:03:54.199 05:03:24 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:54.200 05:03:24 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:54.200 05:03:24 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:54.200 05:03:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:54.200 05:03:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:54.200 05:03:24 -- common/autotest_common.sh@10 -- # set +x 00:03:54.200 ************************************ 00:03:54.200 START TEST default_setup 00:03:54.200 ************************************ 00:03:54.200 05:03:24 -- common/autotest_common.sh@1104 -- # default_setup 00:03:54.200 05:03:24 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:54.200 05:03:24 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:54.200 05:03:24 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:54.200 05:03:24 -- setup/hugepages.sh@51 -- # shift 00:03:54.200 05:03:24 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:54.200 05:03:24 -- setup/hugepages.sh@52 -- # local node_ids 00:03:54.200 05:03:24 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:54.200 05:03:24 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:54.200 05:03:24 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:54.200 05:03:24 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:54.200 05:03:24 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:54.200 05:03:24 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:54.200 05:03:24 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:54.200 05:03:24 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:54.200 05:03:24 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:54.200 05:03:24 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 
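The arithmetic default_setup performs here: get_test_nr_hugepages was called with size=2097152 (kB, i.e. 2 GiB) for node 0, so nr_hugepages = 2097152 / Hugepagesize = 2097152 / 2048 = 1024 pages, all assigned to node 0; clear_hp zeroed every node/size knob first so the run starts from an empty pool. A short sketch of the same computation against the standard per-node sysfs interface (the final write is commented out because it requires root):

    #!/usr/bin/env bash
    # Reproduce the 2097152 kB -> 1024 page computation from the xtrace above.
    size_kb=2097152                                  # requested pool in kB (2 GiB)
    hp_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)   # 2048 on this host
    echo "nr_hugepages = $(( size_kb / hp_kb ))"     # 2097152 / 2048 = 1024
    node0=/sys/devices/system/node/node0/hugepages/hugepages-${hp_kb}kB/nr_hugepages
    # echo $(( size_kb / hp_kb )) > "$node0"         # root-only: the knob the test drives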
00:03:54.200 05:03:24 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:54.200 05:03:24 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:54.200 05:03:24 -- setup/hugepages.sh@73 -- # return 0 00:03:54.200 05:03:24 -- setup/hugepages.sh@137 -- # setup output 00:03:54.200 05:03:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:54.200 05:03:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:57.555 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:57.555 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:58.935 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:59.198 05:03:30 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:59.199 05:03:30 -- setup/hugepages.sh@89 -- # local node 00:03:59.199 05:03:30 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:59.199 05:03:30 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:59.199 05:03:30 -- setup/hugepages.sh@92 -- # local surp 00:03:59.199 05:03:30 -- setup/hugepages.sh@93 -- # local resv 00:03:59.199 05:03:30 -- setup/hugepages.sh@94 -- # local anon 00:03:59.199 05:03:30 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:59.199 05:03:30 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:59.199 05:03:30 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:59.199 05:03:30 -- setup/common.sh@18 -- # local node= 00:03:59.199 05:03:30 -- setup/common.sh@19 -- # local var val 00:03:59.199 05:03:30 -- setup/common.sh@20 -- # local mem_f mem 00:03:59.199 05:03:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.199 05:03:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.199 05:03:30 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.199 05:03:30 -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.199 05:03:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40605360 kB' 'MemAvailable: 41236372 kB' 'Buffers: 4304 kB' 'Cached: 12146420 kB' 'SwapCached: 1968 kB' 'Active: 8022304 kB' 'Inactive: 4694792 kB' 'Active(anon): 7510716 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 567812 kB' 'Mapped: 199204 kB' 'Shmem: 11223084 kB' 'KReclaimable: 572252 kB' 'Slab: 1238964 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666712 kB' 
'KernelStack: 22384 kB' 'PageTables: 9564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14298388 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216500 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB' 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.199 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:03:59.199 05:03:30 -- setup/common.sh@32 -- # continue [xtrace condensed: the '[[ $var == AnonHugePages ]] / continue' check repeats verbatim for each key, Inactive(file) through VmallocTotal] 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 --
setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.200 05:03:30 -- setup/common.sh@33 -- # echo 0 00:03:59.200 05:03:30 -- setup/common.sh@33 -- # return 0 00:03:59.200 05:03:30 -- setup/hugepages.sh@97 -- # anon=0 00:03:59.200 05:03:30 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:59.200 05:03:30 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:59.200 05:03:30 -- setup/common.sh@18 -- # local node= 00:03:59.200 05:03:30 -- setup/common.sh@19 -- # local var val 00:03:59.200 05:03:30 -- setup/common.sh@20 -- # local mem_f mem 00:03:59.200 05:03:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.200 05:03:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.200 05:03:30 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.200 05:03:30 -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.200 05:03:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40606904 kB' 'MemAvailable: 41237916 kB' 'Buffers: 4304 kB' 'Cached: 12146420 kB' 'SwapCached: 1968 kB' 'Active: 8020852 kB' 'Inactive: 4694792 kB' 'Active(anon): 7509264 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566860 kB' 'Mapped: 199104 kB' 'Shmem: 11223084 kB' 'KReclaimable: 572252 kB' 'Slab: 1238868 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666616 kB' 'KernelStack: 22288 kB' 'PageTables: 8992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14298564 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216548 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 
3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB' 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.200 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.200 05:03:30 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 
-- setup/common.sh@32 -- # continue [xtrace condensed: the '[[ $var == HugePages_Surp ]] / continue' check repeats verbatim for each key, Mlocked through VmallocChunk] 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': '
00:03:59.201 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.201 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.201 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.202 05:03:30 -- setup/common.sh@33 -- # echo 0 00:03:59.202 05:03:30 -- setup/common.sh@33 -- # return 0 00:03:59.202 05:03:30 -- setup/hugepages.sh@99 -- # surp=0 00:03:59.202 05:03:30 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:59.202 05:03:30 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:59.202 05:03:30 -- setup/common.sh@18 -- # local node= 00:03:59.202 05:03:30 -- setup/common.sh@19 -- # local var val 00:03:59.202 05:03:30 -- setup/common.sh@20 -- # local mem_f mem 00:03:59.202 05:03:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.202 05:03:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.202 05:03:30 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.202 05:03:30 -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.202 05:03:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.202 05:03:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40604640 kB' 'MemAvailable: 41235652 kB' 'Buffers: 4304 kB' 'Cached: 12146432 kB' 'SwapCached: 1968 kB' 'Active: 8021804 kB' 'Inactive: 4694792 kB' 'Active(anon): 7510216 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 567748 kB' 'Mapped: 199096 kB' 'Shmem: 11223096 kB' 'KReclaimable: 572252 kB' 'Slab: 1238868 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666616 kB' 'KernelStack: 22432 kB' 'PageTables: 9444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14298344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216548 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB' 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.202 05:03:30 -- setup/common.sh@32 -- # continue 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:59.202 05:03:30 -- 
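The condensed scan above is the entire mechanism behind get_meminfo: read one meminfo line at a time, skip until the field name matches, print the value. A minimal standalone sketch of the same pattern (the function name and the sed-based Node-prefix stripping are my shorthand for the mapfile-based parse the trace shows, not the verbatim setup/common.sh source):

```bash
#!/usr/bin/env bash
# Sketch of the lookup the xtrace performs: walk a meminfo file field by
# field and print the value of the one requested field. Per-node files
# prefix each line with "Node <n> ", which the real script strips after
# mapfile; sed stands in for that step here.
get_meminfo_sketch() {
	local get=$1 node=$2 var val _
	local mem_f=/proc/meminfo
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	while IFS=': ' read -r var val _; do
		[[ $var == "$get" ]] || continue   # the long run of "continue" lines above
		echo "$val"
		return 0
	done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
	return 1
}

get_meminfo_sketch HugePages_Surp   # printed 0 in the run traced here
```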
00:03:59.202 05:03:30 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:59.202 05:03:30 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:59.202 05:03:30 -- setup/common.sh@18 -- # local node=
00:03:59.202 05:03:30 -- setup/common.sh@19 -- # local var val
00:03:59.202 05:03:30 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.202 05:03:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.202 05:03:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.202 05:03:30 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:59.202 05:03:30 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.202 05:03:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.202 05:03:30 -- setup/common.sh@31 -- # IFS=': '
00:03:59.202 05:03:30 -- setup/common.sh@31 -- # read -r var val _
00:03:59.202 05:03:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40604640 kB' 'MemAvailable: 41235652 kB' 'Buffers: 4304 kB' 'Cached: 12146432 kB' 'SwapCached: 1968 kB' 'Active: 8021804 kB' 'Inactive: 4694792 kB' 'Active(anon): 7510216 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 567748 kB' 'Mapped: 199096 kB' 'Shmem: 11223096 kB' 'KReclaimable: 572252 kB' 'Slab: 1238868 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666616 kB' 'KernelStack: 22432 kB' 'PageTables: 9444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14298344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216548 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
[~50 xtrace entries condensed: the same per-field "continue" scan of /proc/meminfo until HugePages_Rsvd matches]
00:03:59.203 05:03:30 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:59.203 05:03:30 -- setup/common.sh@33 -- # echo 0
00:03:59.203 05:03:30 -- setup/common.sh@33 -- # return 0
00:03:59.203 05:03:30 -- setup/hugepages.sh@100 -- # resv=0
00:03:59.203 05:03:30 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:59.203 nr_hugepages=1024
00:03:59.203 05:03:30 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:59.203 resv_hugepages=0
00:03:59.203 05:03:30 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:59.203 surplus_hugepages=0
00:03:59.203 05:03:30 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:59.203 anon_hugepages=0
00:03:59.203 05:03:30 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:59.203 05:03:30 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
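The three counters read above feed one consistency check at hugepages.sh@107; with the values just traced it reduces to plain arithmetic. Spelled out for clarity, using the same variable names the trace shows:

```bash
# Values read from /proc/meminfo in the trace above:
nr_hugepages=1024   # HugePages_Total (configured earlier in the test)
surp=0              # HugePages_Surp
resv=0              # HugePages_Rsvd
# hugepages.sh@107: the kernel's reported total must equal the configured
# count once surplus and reserved pages are accounted for.
(( 1024 == nr_hugepages + surp + resv )) && echo "hugepage accounting consistent"
```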
00:03:59.203 05:03:30 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:59.203 05:03:30 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:59.203 05:03:30 -- setup/common.sh@18 -- # local node=
00:03:59.203 05:03:30 -- setup/common.sh@19 -- # local var val
00:03:59.203 05:03:30 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.203 05:03:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.203 05:03:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.203 05:03:30 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:59.203 05:03:30 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.203 05:03:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.203 05:03:30 -- setup/common.sh@31 -- # IFS=': '
00:03:59.204 05:03:30 -- setup/common.sh@31 -- # read -r var val _
00:03:59.204 05:03:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40604904 kB' 'MemAvailable: 41235916 kB' 'Buffers: 4304 kB' 'Cached: 12146436 kB' 'SwapCached: 1968 kB' 'Active: 8020832 kB' 'Inactive: 4694792 kB' 'Active(anon): 7509244 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 566584 kB' 'Mapped: 199104 kB' 'Shmem: 11223100 kB' 'KReclaimable: 572252 kB' 'Slab: 1238860 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666608 kB' 'KernelStack: 22256 kB' 'PageTables: 9124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14296968 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216500 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
[~50 xtrace entries condensed: per-field "continue" scan until HugePages_Total matches]
00:03:59.205 05:03:30 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:59.205 05:03:30 -- setup/common.sh@33 -- # echo 1024
00:03:59.205 05:03:30 -- setup/common.sh@33 -- # return 0
00:03:59.205 05:03:30 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:59.205 05:03:30 -- setup/hugepages.sh@112 -- # get_nodes
00:03:59.205 05:03:30 -- setup/hugepages.sh@27 -- # local node
00:03:59.205 05:03:30 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:59.205 05:03:30 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:59.205 05:03:30 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:59.205 05:03:30 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:59.205 05:03:30 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:59.205 05:03:30 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:59.205 05:03:30 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:59.205 05:03:30 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
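get_nodes at hugepages.sh@27-@33 discovers the NUMA topology from sysfs and records each node's hugepage count before the per-node checks start. A self-contained sketch of the same walk (variable names follow the trace; the awk call is my shorthand for the mapfile-based parse, so treat it as illustrative):

```bash
#!/usr/bin/env bash
# Sketch of the sysfs walk the trace shows: one entry per NUMA node,
# keyed by node number, holding that node's HugePages_Total.
shopt -s extglob nullglob
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
	# Per-node meminfo lines read "Node <n> HugePages_Total: <count>".
	nodes_sys[${node##*node}]=$(awk '$3 == "HugePages_Total:" {print $4}' "$node/meminfo")
done
no_nodes=${#nodes_sys[@]}
echo "no_nodes=$no_nodes"            # 2 on the machine traced here
echo "node0=${nodes_sys[0]:-unset}"  # 1024: all default pages landed on node 0,
                                     # matching nodes_sys[...]=1024 then =0 above
```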
00:03:59.205 05:03:30 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:59.205 05:03:30 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:59.205 05:03:30 -- setup/common.sh@18 -- # local node=0
00:03:59.205 05:03:30 -- setup/common.sh@19 -- # local var val
00:03:59.205 05:03:30 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.205 05:03:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.205 05:03:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:59.205 05:03:30 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:59.205 05:03:30 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.205 05:03:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.205 05:03:30 -- setup/common.sh@31 -- # IFS=': '
00:03:59.206 05:03:30 -- setup/common.sh@31 -- # read -r var val _
00:03:59.206 05:03:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 22090356 kB' 'MemUsed: 10501728 kB' 'SwapCached: 368 kB' 'Active: 5377624 kB' 'Inactive: 358904 kB' 'Active(anon): 5114648 kB' 'Inactive(anon): 972 kB' 'Active(file): 262976 kB' 'Inactive(file): 357932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5441624 kB' 'Mapped: 114476 kB' 'AnonPages: 298232 kB' 'Shmem: 4820348 kB' 'KernelStack: 12808 kB' 'PageTables: 4628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347864 kB' 'Slab: 661940 kB' 'SReclaimable: 347864 kB' 'SUnreclaim: 314076 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[~40 xtrace entries condensed: per-field "continue" scan of the node0 meminfo until HugePages_Surp matches]
00:03:59.207 05:03:30 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:59.207 05:03:30 -- setup/common.sh@33 -- # echo 0
00:03:59.207 05:03:30 -- setup/common.sh@33 -- # return 0
00:03:59.207 05:03:30 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:59.207 05:03:30 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:59.207 05:03:30 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:59.207 05:03:30 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:59.207 05:03:30 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:59.207 node0=1024 expecting 1024
00:03:59.207 05:03:30 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:59.207 
00:03:59.207 real	0m5.332s
00:03:59.207 user	0m1.360s
00:03:59.207 sys	0m2.435s
00:03:59.207 05:03:30 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:59.207 05:03:30 -- common/autotest_common.sh@10 -- # set +x
00:03:59.207 ************************************
00:03:59.207 END TEST default_setup
00:03:59.207 ************************************
00:03:59.207 05:03:30 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:59.207 05:03:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:59.207 05:03:30 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:59.207 05:03:30 -- common/autotest_common.sh@10 -- # set +x
00:03:59.207 ************************************
00:03:59.207 START TEST per_node_1G_alloc
00:03:59.207 ************************************
00:03:59.207 05:03:30 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:03:59.207 05:03:30 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:59.207 05:03:30 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:59.207 05:03:30 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:59.207 05:03:30 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:59.207 05:03:30 -- setup/hugepages.sh@51 -- # shift
00:03:59.207 05:03:30 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:59.207 05:03:30 -- setup/hugepages.sh@52 -- # local node_ids
00:03:59.207 05:03:30 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:59.207 05:03:30 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:59.207 05:03:30 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:59.207 05:03:30 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:59.207 05:03:30 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:59.207 05:03:30 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:59.207 05:03:30 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:59.207 05:03:30 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:59.207 05:03:30 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:59.207 05:03:30 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:59.207 05:03:30 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:59.207 05:03:30 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:59.207 05:03:30 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:59.207 05:03:30 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:59.207 05:03:30 -- setup/hugepages.sh@73 -- # return 0
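The per_node_1G_alloc prologue above is mostly arithmetic: get_test_nr_hugepages is asked for 1048576 kB (1 GiB) on nodes 0 and 1. A worked sketch of where the 512 comes from (my reconstruction of the steps the trace shows, not the verbatim hugepages.sh function):

```bash
# get_test_nr_hugepages 1048576 0 1, as traced above:
size_kb=1048576                # requested test size: 1 GiB expressed in kB
hugepage_kb=$(awk '$1 == "Hugepagesize:" {print $2}' /proc/meminfo)  # 2048 here
nr_hugepages=$(( size_kb / hugepage_kb ))   # 512 pages of 2 MiB each
# get_test_nr_hugepages_per_node 0 1 then assigns that count to every node
# named on the command line:
nodes_test=()
for n in 0 1; do nodes_test[$n]=$nr_hugepages; done
echo "per node: ${nodes_test[0]}/${nodes_test[1]}, total: $(( nr_hugepages * 2 ))"
# -> per node: 512/512, total: 1024, which is exactly the nr_hugepages=1024
#    the trace logs at hugepages.sh@147 after setup.sh runs below.
```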
00:03:59.207 05:03:30 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:59.207 05:03:30 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:59.207 05:03:30 -- setup/hugepages.sh@146 -- # setup output
00:03:59.207 05:03:30 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:59.207 05:03:30 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:03.411 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:03.411 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:03.411 05:03:33 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:04:03.411 05:03:33 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:03.411 05:03:33 -- setup/hugepages.sh@89 -- # local node
00:04:03.411 05:03:33 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:03.411 05:03:33 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:03.411 05:03:33 -- setup/hugepages.sh@92 -- # local surp
00:04:03.411 05:03:33 -- setup/hugepages.sh@93 -- # local resv
00:04:03.411 05:03:33 -- setup/hugepages.sh@94 -- # local anon
00:04:03.411 05:03:33 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:03.411 05:03:33 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:03.411 05:03:33 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:03.411 05:03:33 -- setup/common.sh@18 -- # local node=
00:04:03.411 05:03:33 -- setup/common.sh@19 -- # local var val
00:04:03.411 05:03:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.411 05:03:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.411 05:03:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.411 05:03:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.411 05:03:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.411 05:03:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.411 05:03:33 -- setup/common.sh@31 -- # IFS=': '
00:04:03.411 05:03:33 -- setup/common.sh@31 -- # read -r var val _
00:04:03.411 05:03:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40606912 kB' 'MemAvailable: 41237924 kB' 'Buffers: 4304 kB' 'Cached: 12146544 kB' 'SwapCached: 1968 kB' 'Active: 8017004 kB' 'Inactive: 4694792 kB' 'Active(anon): 7505416 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561844 kB' 'Mapped: 198044 kB' 'Shmem: 11223208 kB' 'KReclaimable: 572252 kB' 'Slab: 1238932 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666680 kB' 'KernelStack: 22176 kB' 'PageTables: 8660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14284096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216596 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
[xtrace entries condensed: the per-field "continue" scan toward AnonHugePages begins; the excerpt ends mid-scan]
setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 
-- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.412 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.412 05:03:33 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.413 05:03:33 -- setup/common.sh@33 -- # echo 0 00:04:03.413 05:03:33 -- setup/common.sh@33 -- # return 0 00:04:03.413 05:03:33 -- setup/hugepages.sh@97 -- # anon=0 00:04:03.413 05:03:33 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:03.413 05:03:33 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:03.413 05:03:33 -- setup/common.sh@18 -- # local node= 00:04:03.413 05:03:33 -- setup/common.sh@19 -- # local var val 00:04:03.413 05:03:33 -- setup/common.sh@20 -- # local mem_f mem 00:04:03.413 05:03:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.413 05:03:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:03.413 05:03:33 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:03.413 05:03:33 -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.413 05:03:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.413 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.413 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.413 05:03:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40606900 kB' 'MemAvailable: 41237912 kB' 'Buffers: 4304 kB' 'Cached: 12146548 kB' 'SwapCached: 1968 kB' 'Active: 8015596 kB' 'Inactive: 4694792 kB' 'Active(anon): 7504008 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 560928 kB' 'Mapped: 197912 kB' 'Shmem: 11223212 kB' 'KReclaimable: 572252 kB' 'Slab: 1238920 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666668 kB' 'KernelStack: 22176 kB' 'PageTables: 8644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14284104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216596 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 
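The trace above is the generic get_meminfo helper at work: it snapshots /proc/meminfo (or a per-node meminfo file under /sys when a node index is supplied), strips any "Node N" prefix, and walks the key/value pairs until the requested key matches. A minimal self-contained sketch of that pattern follows; it is not the actual setup/common.sh source, and the function name is illustrative:

#!/usr/bin/env bash
shopt -s extglob  # needed for the +([0-9]) pattern below

# get_meminfo_sketch KEY [NODE] -- print the value recorded for KEY in
# /proc/meminfo, or in /sys/devices/system/node/nodeNODE/meminfo when
# NODE is given.
get_meminfo_sketch() {
    local get=$1 node=${2:-} var val _ line
    local mem_f=/proc/meminfo mem
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

get_meminfo_sketch HugePages_Total     # prints 1024 on this runner
get_meminfo_sketch HugePages_Surp 0    # per-node surplus, node0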
00:04:03.413 05:03:33 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:03.413 05:03:33 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.413 05:03:33 -- setup/common.sh@18 -- # local node=
00:04:03.413 05:03:33 -- setup/common.sh@19 -- # local var val
00:04:03.413 05:03:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.413 05:03:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.413 05:03:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.413 05:03:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.413 05:03:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.413 05:03:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.413 05:03:33 -- setup/common.sh@31 -- # IFS=': '
00:04:03.413 05:03:33 -- setup/common.sh@31 -- # read -r var val _
00:04:03.413 05:03:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40606900 kB' 'MemAvailable: 41237912 kB' 'Buffers: 4304 kB' 'Cached: 12146548 kB' 'SwapCached: 1968 kB' 'Active: 8015596 kB' 'Inactive: 4694792 kB' 'Active(anon): 7504008 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 560928 kB' 'Mapped: 197912 kB' 'Shmem: 11223212 kB' 'KReclaimable: 572252 kB' 'Slab: 1238920 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666668 kB' 'KernelStack: 22176 kB' 'PageTables: 8644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14284104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216596 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:03.413 05:03:33 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.413 05:03:33 -- setup/common.sh@32 -- # continue
00:04:03.414 05:03:33 -- setup/common.sh@31 -- # IFS=': '
00:04:03.414 05:03:33 -- setup/common.sh@31 -- # read -r var val _
00:04:03.414 05:03:33 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.414 05:03:33 -- setup/common.sh@33 -- # echo 0
00:04:03.414 05:03:33 -- setup/common.sh@33 -- # return 0
00:04:03.414 05:03:33 -- setup/hugepages.sh@99 -- # surp=0
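What the verifier is pulling out of these scans are the hugetlb counters themselves: HugePages_Surp counts pages allocated above nr_hugepages through the overcommit pool, and HugePages_Rsvd counts pages promised to mappings but not yet faulted in; both have to be folded into the accounting before totals can be compared. A small standalone way to collect all four global counters at once (illustrative helper, not part of the test scripts):

#!/usr/bin/env bash
# Collect the global hugetlb counters the verifier compares.
# The associative array name 'hp' is illustrative.
declare -A hp
while IFS=': ' read -r key val _; do
    [[ $key == HugePages_* ]] && hp[$key]=$val
done < /proc/meminfo
printf 'total=%s free=%s rsvd=%s surp=%s\n' \
    "${hp[HugePages_Total]}" "${hp[HugePages_Free]}" \
    "${hp[HugePages_Rsvd]}" "${hp[HugePages_Surp]}"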
00:04:03.414 05:03:33 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:03.414 05:03:33 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:03.414 05:03:33 -- setup/common.sh@18 -- # local node=
00:04:03.414 05:03:33 -- setup/common.sh@19 -- # local var val
00:04:03.414 05:03:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.414 05:03:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.414 05:03:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.414 05:03:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.414 05:03:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.414 05:03:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.414 05:03:33 -- setup/common.sh@31 -- # IFS=': '
00:04:03.414 05:03:33 -- setup/common.sh@31 -- # read -r var val _
00:04:03.414 05:03:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40606648 kB' 'MemAvailable: 41237660 kB' 'Buffers: 4304 kB' 'Cached: 12146560 kB' 'SwapCached: 1968 kB' 'Active: 8015644 kB' 'Inactive: 4694792 kB' 'Active(anon): 7504056 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 560928 kB' 'Mapped: 197912 kB' 'Shmem: 11223224 kB' 'KReclaimable: 572252 kB' 'Slab: 1238920 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666668 kB' 'KernelStack: 22176 kB' 'PageTables: 8644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14284120 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216612 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:03.414 05:03:33 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:03.414 05:03:33 -- setup/common.sh@32 -- # continue
00:04:03.416 05:03:33 -- setup/common.sh@31 -- # IFS=': '
00:04:03.416 05:03:33 -- setup/common.sh@31 -- # read -r var val _
00:04:03.416 05:03:33 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:03.416 05:03:33 -- setup/common.sh@33 -- # echo 0
00:04:03.416 05:03:33 -- setup/common.sh@33 -- # return 0
00:04:03.416 05:03:33 -- setup/hugepages.sh@100 -- # resv=0
00:04:03.416 05:03:33 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:03.416 nr_hugepages=1024
00:04:03.416 05:03:33 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:03.416 resv_hugepages=0
00:04:03.416 05:03:33 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:03.416 surplus_hugepages=0
00:04:03.416 05:03:33 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:03.416 anon_hugepages=0
00:04:03.416 05:03:33 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
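The arithmetic asserted at hugepages.sh@107 and @110: with NRHUGE=512 requested on HUGENODE=0,1 the expected pool is 1024 pages, and the kernel's HugePages_Total must equal that request plus whatever surplus and reserved pages were counted above (both 0 in this run). A sketch of the same check, with illustrative variable names and assuming the two-node, 512-pages-per-node configuration of this run:

#!/usr/bin/env bash
# Consistency check: HugePages_Total must account for the requested
# pages plus surplus and reserved ones.
req_per_node=512 nodes=2
nr_hugepages=$(( req_per_node * nodes ))   # 1024 expected here
total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
resv=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: total=$total"
else
    echo "mismatch: total=$total nr=$nr_hugepages surp=$surp resv=$resv" >&2
    exit 1
fi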
setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:03.416 05:03:33 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:03.416 05:03:33 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:03.416 05:03:33 -- setup/common.sh@18 -- # local node= 00:04:03.416 05:03:33 -- setup/common.sh@19 -- # local var val 00:04:03.416 05:03:33 -- setup/common.sh@20 -- # local mem_f mem 00:04:03.416 05:03:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.416 05:03:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:03.416 05:03:33 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:03.416 05:03:33 -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.416 05:03:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.416 05:03:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40606648 kB' 'MemAvailable: 41237660 kB' 'Buffers: 4304 kB' 'Cached: 12146576 kB' 'SwapCached: 1968 kB' 'Active: 8015480 kB' 'Inactive: 4694792 kB' 'Active(anon): 7503892 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 560752 kB' 'Mapped: 197912 kB' 'Shmem: 11223240 kB' 'KReclaimable: 572252 kB' 'Slab: 1238920 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666668 kB' 'KernelStack: 22160 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14284136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216612 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB' 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.416 05:03:33 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:03.416 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.416 05:03:33 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.417 05:03:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.417 05:03:33 -- setup/common.sh@32 -- # [[ Zswapped 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:03.417 05:03:33 -- setup/common.sh@32 -- # continue
[... the same IFS=': ' read / [[ $var == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue cycle repeats for each remaining /proc/meminfo field: Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted ...]
00:04:03.418 05:03:33 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:03.418 05:03:33 -- setup/common.sh@33 -- # echo 1024
00:04:03.418 05:03:33 -- setup/common.sh@33 -- # return 0
00:04:03.418 05:03:33 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
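For readers following the trace: the long runs of "[[ field == \H\u\g\e... ]] / continue" above are setup/common.sh's get_meminfo scanning the meminfo file one field at a time until the requested key matches. A minimal standalone sketch of that pattern (get_meminfo_sketch is a hypothetical name for illustration, not the SPDK source itself):

#!/usr/bin/env bash
# Minimal sketch of the lookup the trace shows above.
shopt -s extglob  # needed for the +([0-9]) pattern below

get_meminfo_sketch() {
    local get=$1 node=$2
    local var val _ line mem_f
    local -a mem
    mem_f=/proc/meminfo
    # Per-node queries read that node's own meminfo file when it exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node meminfo lines carry a "Node <n> " prefix; strip it
    # (a no-op for plain /proc/meminfo).
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        # IFS=': ' splits "HugePages_Total:    1024" into var and val.
        IFS=': ' read -r var val _ <<< "$line"
        # Quoting $get forces a literal match, which is why xtrace prints
        # the target escaped as \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l.
        [[ $var == "$get" ]] || continue   # one 'continue' per skipped field
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo_sketch HugePages_Total      # prints 1024 on this box
get_meminfo_sketch HugePages_Surp 0     # per-node query; prints 0 in this run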
00:04:03.418 05:03:33 -- setup/hugepages.sh@112 -- # get_nodes
00:04:03.418 05:03:33 -- setup/hugepages.sh@27 -- # local node
00:04:03.418 05:03:33 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:03.418 05:03:33 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:03.418 05:03:33 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:03.418 05:03:33 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:03.418 05:03:33 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:03.418 05:03:33 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:03.418 05:03:33 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:03.418 05:03:33 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:03.418 05:03:33 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:03.418 05:03:33 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.418 05:03:33 -- setup/common.sh@18 -- # local node=0
00:04:03.418 05:03:33 -- setup/common.sh@19 -- # local var val
00:04:03.418 05:03:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.418 05:03:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.418 05:03:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:03.418 05:03:33 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:03.418 05:03:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.418 05:03:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.418 05:03:33 -- setup/common.sh@31 -- # IFS=': '
00:04:03.418 05:03:33 -- setup/common.sh@31 -- # read -r var val _
00:04:03.418 05:03:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23158244 kB' 'MemUsed: 9433840 kB' 'SwapCached: 368 kB' 'Active: 5377700 kB' 'Inactive: 358904 kB' 'Active(anon): 5114724 kB' 'Inactive(anon): 972 kB' 'Active(file): 262976 kB' 'Inactive(file): 357932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5441664 kB' 'Mapped: 114188 kB' 'AnonPages: 298204 kB' 'Shmem: 4820388 kB' 'KernelStack: 12840 kB' 'PageTables: 4740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347864 kB' 'Slab: 662056 kB' 'SReclaimable: 347864 kB' 'SUnreclaim: 314192 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... the read loop tests and skips every node0 field from MemTotal through HugePages_Free with 'continue' before matching the target ...]
00:04:03.419 05:03:33 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.419 05:03:33 -- setup/common.sh@33 -- # echo 0
00:04:03.419 05:03:33 -- setup/common.sh@33 -- # return 0
00:04:03.419 05:03:33 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:03.419 05:03:33 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:03.419 05:03:33 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:03.419 05:03:33 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:03.419 05:03:33 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.419 05:03:33 -- setup/common.sh@18 -- # local node=1
00:04:03.419 05:03:33 -- setup/common.sh@19 -- # local var val
00:04:03.419 05:03:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.419 05:03:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.419 05:03:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:03.419 05:03:33 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:03.419 05:03:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.419 05:03:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.419 05:03:33 -- setup/common.sh@31 -- # IFS=': '
00:04:03.419 05:03:33 -- setup/common.sh@31 -- # read -r var val _
00:04:03.419 05:03:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 17449348 kB' 'MemUsed: 10253800 kB' 'SwapCached: 1600 kB' 'Active: 2637980 kB' 'Inactive: 4335888 kB' 'Active(anon): 2389368 kB' 'Inactive(anon): 4277768 kB' 'Active(file): 248612 kB' 'Inactive(file): 58120 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6711212 kB' 'Mapped: 83724 kB' 'AnonPages: 262736 kB' 'Shmem: 6402880 kB' 'KernelStack: 9336 kB' 'PageTables: 3904 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224388 kB' 'Slab: 576864 kB' 'SReclaimable: 224388 kB' 'SUnreclaim: 352476 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... the same field-by-field scan runs over the node1 dump, skipping MemTotal through HugePages_Free ...]
00:04:03.420 05:03:33 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.420 05:03:33 -- setup/common.sh@33 -- # echo 0
00:04:03.420 05:03:33 -- setup/common.sh@33 -- # return 0
00:04:03.420 05:03:33 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:03.420 05:03:33 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:03.420 05:03:33 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:03.420 05:03:33 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:03.420 05:03:33 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:03.420 node0=512 expecting 512
00:04:03.420 05:03:33 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:03.420 05:03:33 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:03.420 05:03:33 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:03.420 05:03:33 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:03.420 node1=512 expecting 512
00:04:03.420 05:03:33 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:03.420 
00:04:03.420 real    0m3.727s
00:04:03.420 user    0m1.417s
00:04:03.420 sys     0m2.383s
00:04:03.420 05:03:34 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:03.420 05:03:34 -- common/autotest_common.sh@10 -- # set +x
00:04:03.420 ************************************
00:04:03.420 END TEST per_node_1G_alloc
00:04:03.420 ************************************
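The arithmetic just traced is the core of the check: the system-wide HugePages_Total (1024) must equal nr_hugepages plus surplus and reserved pages, and each node's expected share plus its per-node surplus must come out at 512. A condensed sketch of that accounting, reusing the hypothetical get_meminfo_sketch helper from the earlier sketch (two nodes and zero surplus/reserved assumed, as in this run; the function name and structure are illustrative, not the SPDK source):

# Sketch of the per-node accounting from the verify path traced above.
shopt -s extglob

check_nodes_sketch() {
    local nr_hugepages=$1 surp=$2 resv=$3
    local -a nodes_test=(512 512)   # expected even split seen in the trace
    local node node_surp total
    total=$(get_meminfo_sketch HugePages_Total)
    # hugepages.sh@110 equivalent: totals must reconcile exactly.
    (( total == nr_hugepages + surp + resv )) || return 1
    for node in /sys/devices/system/node/node+([0-9]); do
        node=${node##*node}
        node_surp=$(get_meminfo_sketch HugePages_Surp "$node")
        # hugepages.sh@116-117 equivalent: add reserved and surplus pages.
        (( nodes_test[node] += resv + node_surp ))
        echo "node$node=${nodes_test[node]} expecting 512"
    done
}

check_nodes_sketch 1024 0 0   # reproduces "node0=512 expecting 512" etc.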
00:04:03.421 05:03:34 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:03.421 05:03:34 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:03.421 05:03:34 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:03.421 05:03:34 -- common/autotest_common.sh@10 -- # set +x
00:04:03.421 ************************************
00:04:03.421 START TEST even_2G_alloc
00:04:03.421 ************************************
00:04:03.421 05:03:34 -- common/autotest_common.sh@1104 -- # even_2G_alloc
00:04:03.421 05:03:34 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:03.421 05:03:34 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:03.421 05:03:34 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:03.421 05:03:34 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:03.421 05:03:34 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:03.421 05:03:34 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:03.421 05:03:34 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:03.421 05:03:34 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:03.421 05:03:34 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:03.421 05:03:34 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:03.421 05:03:34 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:03.421 05:03:34 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:03.421 05:03:34 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:03.421 05:03:34 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:03.421 05:03:34 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:03.421 05:03:34 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:03.421 05:03:34 -- setup/hugepages.sh@83 -- # : 512
00:04:03.421 05:03:34 -- setup/hugepages.sh@84 -- # : 1
00:04:03.421 05:03:34 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:03.421 05:03:34 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:03.421 05:03:34 -- setup/hugepages.sh@83 -- # : 0
00:04:03.421 05:03:34 -- setup/hugepages.sh@84 -- # : 0
00:04:03.421 05:03:34 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:03.421 05:03:34 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:03.421 05:03:34 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:03.421 05:03:34 -- setup/hugepages.sh@153 -- # setup output
00:04:03.421 05:03:34 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:03.421 05:03:34 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
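even_2G_alloc's sizing, as traced above, is plain arithmetic: the requested 2097152 kB divided by the default 2048 kB hugepage size (visible as 'Hugepagesize: 2048 kB' in the meminfo dumps below) gives nr_hugepages=1024, and HUGE_EVEN_ALLOC=yes splits that evenly across the two NUMA nodes. A worked sketch with this run's values:

# Worked sizing arithmetic for even_2G_alloc (values taken from this log).
size_kb=2097152            # requested test size: 2 GiB in kB
hugepage_kb=2048           # default hugepage size on this machine
no_nodes=2                 # NUMA nodes present
nr_hugepages=$(( size_kb / hugepage_kb ))   # 1024 pages
per_node=$(( nr_hugepages / no_nodes ))     # 512 pages per node
echo "nr_hugepages=$nr_hugepages per_node=$per_node"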
00:04:06.718 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:06.718 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:06.718 05:03:37 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:06.718 05:03:37 -- setup/hugepages.sh@89 -- # local node
00:04:06.718 05:03:37 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:06.718 05:03:37 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:06.718 05:03:37 -- setup/hugepages.sh@92 -- # local surp
00:04:06.718 05:03:37 -- setup/hugepages.sh@93 -- # local resv
00:04:06.718 05:03:37 -- setup/hugepages.sh@94 -- # local anon
00:04:06.718 05:03:37 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:06.718 05:03:37 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:06.718 05:03:37 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:06.718 05:03:37 -- setup/common.sh@18 -- # local node=
00:04:06.718 05:03:37 -- setup/common.sh@19 -- # local var val
00:04:06.718 05:03:37 -- setup/common.sh@20 -- # local mem_f mem
00:04:06.718 05:03:37 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:06.718 05:03:37 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:06.718 05:03:37 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:06.718 05:03:37 -- setup/common.sh@28 -- # mapfile -t mem
00:04:06.718 05:03:37 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:06.718 05:03:37 -- setup/common.sh@31 -- # IFS=': '
00:04:06.718 05:03:37 -- setup/common.sh@31 -- # read -r var val _
00:04:06.719 05:03:37 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40655492 kB' 'MemAvailable: 41286504 kB' 'Buffers: 4304 kB' 'Cached: 12146680 kB' 'SwapCached: 1968 kB' 'Active: 8016688 kB' 'Inactive: 4694792 kB' 'Active(anon): 7505100 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561728 kB' 'Mapped: 197940 kB' 'Shmem: 11223344 kB' 'KReclaimable: 572252 kB' 'Slab: 1238984 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666732 kB' 'KernelStack: 22192 kB' 'PageTables: 8680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14284756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216612 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
[... the read loop tests and skips every field from MemTotal through HardwareCorrupted with 'continue' before matching the target ...]
00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:06.720 05:03:37 -- setup/common.sh@33 -- # echo 0
00:04:06.720 05:03:37 -- setup/common.sh@33 -- # return 0
00:04:06.720 05:03:37 -- setup/hugepages.sh@97 -- # anon=0
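With the allocation in place, verify_nr_hugepages gathers the three system-wide counters that could skew the per-node totals; all three are zero in this run. A sketch of that step, again reusing the hypothetical get_meminfo_sketch helper from the earlier sketch:

# The three counters the verify path reads before comparing per-node totals
# (all 0 in this log).
anon=$(get_meminfo_sketch AnonHugePages)   # transparent hugepage usage, kB
surp=$(get_meminfo_sketch HugePages_Surp)  # pages allocated beyond nr_hugepages
resv=$(get_meminfo_sketch HugePages_Rsvd)  # pages reserved but not yet faulted in
echo "anon=$anon surp=$surp resv=$resv"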
00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.720 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.720 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.721 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.721 05:03:37 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.721 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.721 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.721 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.721 05:03:37 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.721 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.721 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.721 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.721 05:03:37 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.721 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.721 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.721 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.721 05:03:37 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.721 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.721 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.721 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.721 05:03:37 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:06.721 05:03:37 -- setup/common.sh@32 -- # continue
[xtrace condensed: the setup/common.sh@31-32 cycle of IFS=': ', read -r var val _, [[ <key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]], continue repeats for each remaining non-matching key: NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd]
00:04:06.721 05:03:37 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:06.721 05:03:37 -- setup/common.sh@33 -- # echo 0
00:04:06.721 05:03:37 -- setup/common.sh@33 -- # return 0
00:04:06.721 05:03:37 -- setup/hugepages.sh@99 -- # surp=0
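At this point surp=0: setup/common.sh's get_meminfo has walked /proc/meminfo key by key (the IFS=': ' / read -r var val _ cycle above) until it reached HugePages_Surp and echoed its value. A minimal standalone sketch of that parsing loop, reconstructed from the trace rather than copied from the SPDK source (the name get_meminfo_sketch is ours):

    # Split each /proc/meminfo line on ': ' and print the value of the
    # requested key, mirroring the traced read loop; prints nothing if
    # the key is absent.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"    # e.g. HugePages_Surp -> 0 in this run
                return 0
            fi
        done < /proc/meminfo
    }

The escaped pattern \H\u\g\e\P\a\g\e\s\_\S\u\r\p in the trace is just how xtrace prints the quoted right-hand side of [[ $var == "$get" ]].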
00:04:06.721 05:03:37 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:06.721 05:03:37 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:06.721 05:03:37 -- setup/common.sh@18 -- # local node=
00:04:06.721 05:03:37 -- setup/common.sh@19 -- # local var val
00:04:06.721 05:03:37 -- setup/common.sh@20 -- # local mem_f mem
00:04:06.721 05:03:37 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:06.721 05:03:37 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:06.721 05:03:37 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:06.721 05:03:37 -- setup/common.sh@28 -- # mapfile -t mem
00:04:06.721 05:03:37 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:06.721 05:03:37 -- setup/common.sh@31 -- # IFS=': '
00:04:06.721 05:03:37 -- setup/common.sh@31 -- # read -r var val _
00:04:06.721 05:03:37 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40659200 kB' 'MemAvailable: 41290212 kB' 'Buffers: 4304 kB' 'Cached: 12146696 kB' 'SwapCached: 1968 kB' 'Active: 8016428 kB' 'Inactive: 4694792 kB' 'Active(anon): 7504840 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561488 kB' 'Mapped: 197916 kB' 'Shmem: 11223360 kB' 'KReclaimable: 572252 kB' 'Slab: 1239044 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666792 kB' 'KernelStack: 22176 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14284780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216596 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
[xtrace condensed: the setup/common.sh@31-32 key test repeats with continue for every field from MemTotal through HugePages_Free until HugePages_Rsvd matches]
00:04:06.723 05:03:37 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:06.723 05:03:37 -- setup/common.sh@33 -- # echo 0
00:04:06.723 05:03:37 -- setup/common.sh@33 -- # return 0
00:04:06.723 05:03:37 -- setup/hugepages.sh@100 -- # resv=0
00:04:06.723 05:03:37 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:06.723 nr_hugepages=1024
00:04:06.723 05:03:37 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:06.723 resv_hugepages=0
00:04:06.723 05:03:37 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:06.723 surplus_hugepages=0
00:04:06.723 05:03:37 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:06.723 anon_hugepages=0
00:04:06.723 05:03:37 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:06.723 05:03:37 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
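With surp and resv both read back as 0, hugepages.sh@107-@110 asserts that the kernel's hugepage accounting matches the request: HugePages_Total must equal nr_hugepages + surp + resv (1024 == 1024 + 0 + 0 in this run) before the per-node checks begin. A hedged sketch of that assertion, reusing get_meminfo_sketch from above:

    # Accounting check as traced at setup/hugepages.sh@107/@110: the test
    # only proceeds when total pages match requested + surplus + reserved.
    nr_hugepages=1024
    surp=$(get_meminfo_sketch HugePages_Surp)      # 0 in this run
    resv=$(get_meminfo_sketch HugePages_Rsvd)      # 0 in this run
    total=$(get_meminfo_sketch HugePages_Total)    # 1024 in this run
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2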
00:04:06.723 05:03:37 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:06.723 05:03:37 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:06.723 05:03:37 -- setup/common.sh@18 -- # local node=
00:04:06.723 05:03:37 -- setup/common.sh@19 -- # local var val
00:04:06.723 05:03:37 -- setup/common.sh@20 -- # local mem_f mem
00:04:06.723 05:03:37 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:06.723 05:03:37 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:06.723 05:03:37 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:06.723 05:03:37 -- setup/common.sh@28 -- # mapfile -t mem
00:04:06.723 05:03:37 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:06.723 05:03:37 -- setup/common.sh@31 -- # IFS=': '
00:04:06.723 05:03:37 -- setup/common.sh@31 -- # read -r var val _
00:04:06.723 05:03:37 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40659452 kB' 'MemAvailable: 41290464 kB' 'Buffers: 4304 kB' 'Cached: 12146724 kB' 'SwapCached: 1968 kB' 'Active: 8016080 kB' 'Inactive: 4694792 kB' 'Active(anon): 7504492 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561080 kB' 'Mapped: 197916 kB' 'Shmem: 11223388 kB' 'KReclaimable: 572252 kB' 'Slab: 1239044 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666792 kB' 'KernelStack: 22160 kB' 'PageTables: 8580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14284796 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216596 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
[xtrace condensed: the setup/common.sh@31-32 key test repeats with continue for every field from MemTotal through Unaccepted until HugePages_Total matches]
00:04:06.725 05:03:37 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:06.725 05:03:37 -- setup/common.sh@33 -- # echo 1024
00:04:06.725 05:03:37 -- setup/common.sh@33 -- # return 0
00:04:06.725 05:03:37 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:06.725 05:03:37 -- setup/hugepages.sh@112 -- # get_nodes
00:04:06.725 05:03:37 -- setup/hugepages.sh@27 -- # local node
00:04:06.725 05:03:37 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:06.725 05:03:37 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:06.725 05:03:37 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:06.725 05:03:37 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:06.725 05:03:37 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:06.725 05:03:37 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
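The two nodes_sys[...]=512 assignments above come from get_nodes walking /sys/devices/system/node/node+([0-9]); with no_nodes=2, the 1024 pages are expected to split 512/512. The per-node get_meminfo calls that follow switch their input from /proc/meminfo to the node's sysfs meminfo and strip the leading "Node N " prefix (the mem=("${mem[@]#Node +([0-9]) }") extglob step in the trace). A sketch of that per-node variant, again reconstructed from the trace (get_node_meminfo_sketch is our name):

    # Read a key from /sys/devices/system/node/nodeN/meminfo, whose lines
    # look like 'Node 0 HugePages_Total:   512'; fall back to /proc/meminfo
    # when the sysfs file is missing.
    get_node_meminfo_sketch() {
        local node=$1 get=$2 mem_f=/proc/meminfo
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        sed -E 's/^Node [0-9]+ +//' "$mem_f" |
            awk -v k="$get" -F': +' '$1 == k { print $2 + 0; exit }'
    }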
00:04:06.725 05:03:37 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:06.725 05:03:37 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:06.725 05:03:37 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:06.725 05:03:37 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:06.725 05:03:37 -- setup/common.sh@18 -- # local node=0
00:04:06.725 05:03:37 -- setup/common.sh@19 -- # local var val
00:04:06.725 05:03:37 -- setup/common.sh@20 -- # local mem_f mem
00:04:06.725 05:03:37 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:06.725 05:03:37 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:06.725 05:03:37 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:06.725 05:03:37 -- setup/common.sh@28 -- # mapfile -t mem
00:04:06.725 05:03:37 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:06.725 05:03:37 -- setup/common.sh@31 -- # IFS=': '
00:04:06.725 05:03:37 -- setup/common.sh@31 -- # read -r var val _
00:04:06.725 05:03:37 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23183976 kB' 'MemUsed: 9408108 kB' 'SwapCached: 368 kB' 'Active: 5378908 kB' 'Inactive: 358904 kB' 'Active(anon): 5115932 kB' 'Inactive(anon): 972 kB' 'Active(file): 262976 kB' 'Inactive(file): 357932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5441716 kB' 'Mapped: 114188 kB' 'AnonPages: 299352 kB' 'Shmem: 4820440 kB' 'KernelStack: 12840 kB' 'PageTables: 4744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347864 kB' 'Slab: 662120 kB' 'SReclaimable: 347864 kB' 'SUnreclaim: 314256 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace condensed: the setup/common.sh@31-32 key test repeats with continue for every node0 field from MemTotal through HugePages_Free until HugePages_Surp matches]
00:04:06.726 05:03:37 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:06.726 05:03:37 -- setup/common.sh@33 -- # echo 0
00:04:06.726 05:03:37 -- setup/common.sh@33 -- # return 0
00:04:06.726 05:03:37 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
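Node 0 passes: its HugePages_Surp is 0, so nodes_test[0] stays at the expected 512 after the reserved and surplus bumps, and the loop repeats for node 1 below. A compact sketch of that accumulation, under the same assumptions as the sketches above:

    # Per-node accumulation as traced at setup/hugepages.sh@115-@117:
    # expected count per node = base + reserved + that node's surplus.
    declare -a nodes_test=([0]=512 [1]=512)
    resv=0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        (( nodes_test[node] += $(get_node_meminfo_sketch "$node" HugePages_Surp) ))
    done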
/sys/devices/system/node/node1/meminfo ]] 00:04:06.726 05:03:37 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:06.726 05:03:37 -- setup/common.sh@28 -- # mapfile -t mem 00:04:06.726 05:03:37 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.726 05:03:37 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 17474328 kB' 'MemUsed: 10228820 kB' 'SwapCached: 1600 kB' 'Active: 2638484 kB' 'Inactive: 4335888 kB' 'Active(anon): 2389872 kB' 'Inactive(anon): 4277768 kB' 'Active(file): 248612 kB' 'Inactive(file): 58120 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6711296 kB' 'Mapped: 83728 kB' 'AnonPages: 263240 kB' 'Shmem: 6402964 kB' 'KernelStack: 9352 kB' 'PageTables: 3952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224388 kB' 'Slab: 576932 kB' 'SReclaimable: 224388 kB' 'SUnreclaim: 352544 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.726 05:03:37 -- setup/common.sh@32 -- # continue 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # IFS=': ' 00:04:06.726 05:03:37 -- setup/common.sh@31 -- # read -r var val _ 00:04:06.726 05:03:37 -- 
setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:06.726 05:03:37 -- setup/common.sh@32 -- # continue
00:04:06.726 [... xtrace condensed: setup/common.sh@31-32 read and skip every remaining /proc/meminfo key that is not HugePages_Surp ...]
00:04:06.987 05:03:37 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:06.987 05:03:37 -- setup/common.sh@33 -- # echo 0
00:04:06.987 05:03:37 -- setup/common.sh@33 -- # return 0
00:04:06.987 05:03:37 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:06.987 05:03:37 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:06.987 05:03:37 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:06.987 05:03:37 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:06.987 05:03:37 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:06.987 node0=512 expecting 512
00:04:06.987 05:03:37 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:06.987 05:03:37 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:06.987 05:03:37 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:06.987 05:03:37 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:06.987 node1=512 expecting 512
00:04:06.987 05:03:37 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:06.987 
00:04:06.987 real 0m3.765s
00:04:06.987 user 0m1.408s
00:04:06.987 sys 0m2.423s
00:04:06.987 05:03:37 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:06.987 05:03:37 -- common/autotest_common.sh@10 -- # set +x
00:04:06.987 ************************************
00:04:06.987 END TEST even_2G_alloc
00:04:06.987 ************************************
00:04:06.987 05:03:37 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:06.987 05:03:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:06.987 05:03:37 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:06.987 05:03:37 -- common/autotest_common.sh@10 -- # set +x
00:04:06.987 ************************************
00:04:06.987 START TEST odd_alloc
00:04:06.987 ************************************
00:04:06.987 05:03:37 -- common/autotest_common.sh@1104 -- # odd_alloc
00:04:06.987 05:03:37 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:04:06.987 05:03:37 -- setup/hugepages.sh@49 -- # local size=2098176
00:04:06.987 05:03:37 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:06.987 05:03:37 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:06.987 05:03:37 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:04:06.987 05:03:37 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:06.987 05:03:37 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:06.987 05:03:37 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:06.987 05:03:37 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:04:06.987 05:03:37 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:06.987 05:03:37 -- setup/hugepages.sh@67 -- # nodes_test=()
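The get_test_nr_hugepages trace above turns the requested 2098176 kB into an intentionally odd page count, and the per-node assignment that continues below splits it across both NUMA nodes. A minimal sketch of the same arithmetic, with illustrative variable names rather than the actual setup/hugepages.sh code:

    #!/usr/bin/env bash
    # Sketch of the traced arithmetic (names are illustrative).
    # 2098176 kB / 2048 kB per hugepage = 1024.5 -> rounds up to 1025,
    # an odd count by design for the odd_alloc test.
    size_kb=2098176
    hugepage_kb=2048
    nr_hugepages=$(( (size_kb + hugepage_kb - 1) / hugepage_kb ))   # ceil -> 1025

    # Split across the two NUMA nodes; the odd remainder lands on node0,
    # matching the nodes_test assignments (node0=513, node1=512) in the trace.
    no_nodes=2
    per_node=$(( nr_hugepages / no_nodes ))     # 512
    nodes_test=()
    for (( node = 0; node < no_nodes; node++ )); do
        nodes_test[node]=$per_node
    done
    nodes_test[0]=$(( per_node + nr_hugepages % no_nodes ))   # 513
    echo "nr_hugepages=$nr_hugepages node0=${nodes_test[0]} node1=${nodes_test[1]}"

The HUGEMEM=2049 handed to setup.sh below expresses the same request in MB: 2049 MB over 2 MB pages is again 1025 pages.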
00:04:06.987 05:03:37 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:06.987 05:03:37 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:06.987 05:03:37 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:06.987 05:03:37 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:06.987 05:03:37 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:06.987 05:03:37 -- setup/hugepages.sh@83 -- # : 513
00:04:06.987 05:03:37 -- setup/hugepages.sh@84 -- # : 1
00:04:06.987 05:03:37 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:06.987 05:03:37 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:04:06.987 05:03:37 -- setup/hugepages.sh@83 -- # : 0
00:04:06.987 05:03:37 -- setup/hugepages.sh@84 -- # : 0
00:04:06.987 05:03:37 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:06.987 05:03:37 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:04:06.987 05:03:37 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:04:06.987 05:03:37 -- setup/hugepages.sh@160 -- # setup output
00:04:06.987 05:03:37 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:06.987 05:03:37 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:10.280 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:10.280 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:10.544 05:03:41 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:10.544 05:03:41 -- setup/hugepages.sh@89 -- # local node
00:04:10.544 05:03:41 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:10.544 05:03:41 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:10.544 05:03:41 -- setup/hugepages.sh@92 -- # local surp
00:04:10.544 05:03:41 -- setup/hugepages.sh@93 -- # local resv
00:04:10.544 05:03:41 -- setup/hugepages.sh@94 -- # local anon
00:04:10.544 05:03:41 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:10.544 05:03:41 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
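Each get_meminfo call traced from here on expands to the same pattern: snapshot the meminfo file into an array, strip any per-node "Node <n> " prefix, then read each "Key: value kB" line with IFS=': ' until the requested key matches and its value is echoed. A condensed, self-contained sketch of that pattern; the helper name is hypothetical and the prefix handling covers single-digit node ids only, so this is not setup/common.sh itself:

    #!/usr/bin/env bash
    # Condensed sketch of the get_meminfo pattern traced in this log.
    get_meminfo_sketch() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo line var val _
        # Per-node meminfo files prefix every line with "Node <n> ".
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while IFS= read -r line; do
            line=${line#Node [0-9] }                # drop per-node prefix (single digit)
            IFS=': ' read -r var val _ <<< "$line"  # split "Key: value kB"
            if [[ $var == "$get" ]]; then
                echo "$val"                         # value only, unit dropped
                return 0
            fi
        done < "$mem_f"
        return 1
    }

    get_meminfo_sketch HugePages_Total     # -> 1025 on this runner
    get_meminfo_sketch HugePages_Free 0    # per-node variant

The xtrace loops that follow are simply this read-and-compare loop unrolled once per /proc/meminfo key, which is why they are condensed below.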
00:04:10.544 05:03:41 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:10.544 05:03:41 -- setup/common.sh@18 -- # local node=
00:04:10.544 05:03:41 -- setup/common.sh@19 -- # local var val
00:04:10.544 05:03:41 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.544 05:03:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.544 05:03:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.544 05:03:41 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.544 05:03:41 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.544 05:03:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.544 05:03:41 -- setup/common.sh@31 -- # IFS=': '
00:04:10.544 05:03:41 -- setup/common.sh@31 -- # read -r var val _
00:04:10.544 05:03:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40702080 kB' 'MemAvailable: 41333092 kB' 'Buffers: 4304 kB' 'Cached: 12146812 kB' 'SwapCached: 1968 kB' 'Active: 8016960 kB' 'Inactive: 4694792 kB' 'Active(anon): 7505372 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561980 kB' 'Mapped: 198380 kB' 'Shmem: 11223476 kB' 'KReclaimable: 572252 kB' 'Slab: 1240012 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667760 kB' 'KernelStack: 22224 kB' 'PageTables: 8820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 14285040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216740 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:10.544 [... xtrace condensed: setup/common.sh@31-32 read and skip each key until AnonHugePages matches ...]
00:04:10.545 05:03:41 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:10.545 05:03:41 -- setup/common.sh@33 -- # echo 0
00:04:10.545 05:03:41 -- setup/common.sh@33 -- # return 0
00:04:10.545 05:03:41 -- setup/hugepages.sh@97 -- # anon=0
00:04:10.545 05:03:41 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:10.545 05:03:41 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:10.545 05:03:41 -- setup/common.sh@18 -- # local node=
00:04:10.545 05:03:41 -- setup/common.sh@19 -- # local var val
00:04:10.545 05:03:41 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.545 05:03:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.545 05:03:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.545 05:03:41 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.545 05:03:41 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.545 05:03:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.545 05:03:41 -- setup/common.sh@31 -- # IFS=': '
00:04:10.545 05:03:41 -- setup/common.sh@31 -- # read -r var val _
00:04:10.545 05:03:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40700964 kB' 'MemAvailable: 41331976 kB' 'Buffers: 4304 kB' 'Cached: 12146812 kB' 'SwapCached: 1968 kB' 'Active: 8019568 kB' 'Inactive: 4694792 kB' 'Active(anon): 7507980 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564568 kB' 'Mapped: 198428 kB' 'Shmem: 11223476 kB' 'KReclaimable: 572252 kB' 'Slab: 1239928 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667676 kB' 'KernelStack: 22176 kB' 'PageTables: 8648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 14289284 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216692 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:10.545 [... xtrace condensed: keys skipped until HugePages_Surp matches ...]
00:04:10.546 05:03:41 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:10.546 05:03:41 -- setup/common.sh@33 -- # echo 0
00:04:10.546 05:03:41 -- setup/common.sh@33 -- # return 0
00:04:10.546 05:03:41 -- setup/hugepages.sh@99 -- # surp=0
00:04:10.546 05:03:41 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:10.546 05:03:41 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:10.546 05:03:41 -- setup/common.sh@18 -- # local node=
00:04:10.546 05:03:41 -- setup/common.sh@19 -- # local var val
00:04:10.546 05:03:41 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.546 05:03:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.546 05:03:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.546 05:03:41 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.546 05:03:41 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.546 05:03:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.546 05:03:41 -- setup/common.sh@31 -- # IFS=': '
00:04:10.546 05:03:41 -- setup/common.sh@31 -- # read -r var val _
00:04:10.547 05:03:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40700880 kB' 'MemAvailable: 41331892 kB' 'Buffers: 4304 kB' 'Cached: 12146824 kB' 'SwapCached: 1968 kB' 'Active: 8016420 kB' 'Inactive: 4694792 kB' 'Active(anon): 7504832 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561388 kB' 'Mapped: 197924 kB' 'Shmem: 11223488 kB' 'KReclaimable: 572252 kB' 'Slab: 1239928 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667676 kB' 'KernelStack: 22160 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 14285436 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216676 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:10.547 [... xtrace condensed: keys skipped until HugePages_Rsvd matches ...]
00:04:10.548 05:03:41 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:10.548 05:03:41 -- setup/common.sh@33 -- # echo 0
00:04:10.548 05:03:41 -- setup/common.sh@33 -- # return 0
00:04:10.548 05:03:41 -- setup/hugepages.sh@100 -- # resv=0
00:04:10.548 05:03:41 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:10.548 nr_hugepages=1025
00:04:10.548 05:03:41 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:10.548 resv_hugepages=0
00:04:10.548 05:03:41 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:10.548 surplus_hugepages=0
00:04:10.548 05:03:41 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:10.548 anon_hugepages=0
00:04:10.548 05:03:41 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:10.548 05:03:41 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
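With anon, surp and resv collected, the hugepages.sh@107 and @109 checks above assert that the kernel's HugePages_Total accounts exactly for the requested pool. A self-contained sketch of the same verification arithmetic; meminfo() here is a hypothetical stand-in for setup/common.sh's get_meminfo, reusing the read pattern from the earlier sketch:

    #!/usr/bin/env bash
    # Sketch of the verification arithmetic traced above.
    meminfo() {
        local key=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$key" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
    }

    nr_hugepages=1025                  # requested by the test
    anon=$(meminfo AnonHugePages)      # 0 kB in this run (no THP noise)
    surp=$(meminfo HugePages_Surp)     # 0
    resv=$(meminfo HugePages_Rsvd)     # 0
    total=$(meminfo HugePages_Total)   # 1025 on this runner

    # The pool must account exactly for the request:
    # total = requested + surplus + reserved.
    (( total == nr_hugepages + surp + resv )) || { echo 'hugepage accounting mismatch'; exit 1; }
    echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=$anon"

The final get_meminfo HugePages_Total call below feeds the same identity before the per-node counts are compared.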
00:04:10.548 05:03:41 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:10.548 05:03:41 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:10.548 05:03:41 -- setup/common.sh@18 -- # local node=
00:04:10.548 05:03:41 -- setup/common.sh@19 -- # local var val
00:04:10.548 05:03:41 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.548 05:03:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.548 05:03:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.548 05:03:41 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.548 05:03:41 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.548 05:03:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.548 05:03:41 -- setup/common.sh@31 -- # IFS=': '
00:04:10.548 05:03:41 -- setup/common.sh@31 -- # read -r var val _
00:04:10.548 05:03:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40700632 kB' 'MemAvailable: 41331644 kB' 'Buffers: 4304 kB' 'Cached: 12146844 kB' 'SwapCached: 1968 kB' 'Active: 8016576 kB' 'Inactive: 4694792 kB' 'Active(anon): 7504988 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561508 kB' 'Mapped: 197924 kB' 'Shmem: 11223508 kB' 'KReclaimable: 572252 kB' 'Slab: 1239928 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667676 kB' 'KernelStack: 22176 kB' 'PageTables: 8648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 14285452 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216676 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:10.549 [... xtrace condensed: setup/common.sh@31-32 read and skip each key while matching against HugePages_Total ...]
00:04:10.549 05:03:41 --
setup/common.sh@31 -- # IFS=': ' 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # continue 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # continue 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # continue 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # continue 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # continue 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # continue 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # continue 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.549 05:03:41 -- setup/common.sh@32 -- # continue 00:04:10.549 05:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:04:10.550 05:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:04:10.550 05:03:41 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:10.550 05:03:41 -- setup/common.sh@33 -- # echo 1025 00:04:10.550 05:03:41 -- setup/common.sh@33 -- # return 0 00:04:10.550 05:03:41 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:10.550 05:03:41 -- setup/hugepages.sh@112 -- # get_nodes 00:04:10.550 05:03:41 -- setup/hugepages.sh@27 -- # local node 00:04:10.550 05:03:41 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:10.550 05:03:41 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:10.550 05:03:41 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:10.550 05:03:41 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:10.550 05:03:41 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:10.550 05:03:41 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:10.550 05:03:41 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:10.550 05:03:41 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:10.550 05:03:41 -- setup/hugepages.sh@117 -- # get_meminfo 
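The common.sh@17-@31 lines above are the setup of the meminfo parser this trace keeps re-entering. A minimal, self-contained sketch of that helper, reconstructed from the xtrace (the function name and variable names match the trace; the shipped source in setup/common.sh may differ in detail):

  #!/usr/bin/env bash
  shopt -s extglob                      # needed for the +([0-9]) pattern below
  # get_meminfo <field> [node] -- print the value of one meminfo field,
  # globally from /proc/meminfo or per-node from that node's meminfo file
  get_meminfo() {
      local get=$1 node=$2 mem_f=/proc/meminfo var val _
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local -a mem
      mapfile -t mem <"$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")  # node files prefix each line with "Node N "
      local line IFS=': '
      for line in "${mem[@]}"; do       # scan key by key, exactly as the trace shows
          read -r var val _ <<<"$line"
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done
      return 1
  }
  get_meminfo HugePages_Total           # prints 1025 at this point in the run
  get_meminfo HugePages_Surp 0          # node-local variant, used just below

The per-field [[ ... ]] / continue pairs that dominate this log are simply that scan loop running once per meminfo line.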
00:04:10.550 05:03:41 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:10.550 05:03:41 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:10.550 05:03:41 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:10.550 05:03:41 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:10.550 05:03:41 -- setup/common.sh@18 -- # local node=0
00:04:10.550 05:03:41 -- setup/common.sh@19 -- # local var val
00:04:10.550 05:03:41 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.550 05:03:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.550 05:03:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:10.550 05:03:41 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:10.550 05:03:41 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.550 05:03:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.550 05:03:41 -- setup/common.sh@31 -- # IFS=': '
00:04:10.550 05:03:41 -- setup/common.sh@31 -- # read -r var val _
00:04:10.550 05:03:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23213096 kB' 'MemUsed: 9378988 kB' 'SwapCached: 368 kB' 'Active: 5378720 kB' 'Inactive: 358904 kB' 'Active(anon): 5115744 kB' 'Inactive(anon): 972 kB' 'Active(file): 262976 kB' 'Inactive(file): 357932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5441768 kB' 'Mapped: 114188 kB' 'AnonPages: 299012 kB' 'Shmem: 4820492 kB' 'KernelStack: 12824 kB' 'PageTables: 4692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347864 kB' 'Slab: 662704 kB' 'SReclaimable: 347864 kB' 'SUnreclaim: 314840 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[repetitive xtrace elided: setup/common.sh@32 tests each key of the node0 dump above against HugePages_Surp and issues 'continue' until the key matches]
00:04:10.551 05:03:41 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:10.551 05:03:41 -- setup/common.sh@33 -- # echo 0
00:04:10.551 05:03:41 -- setup/common.sh@33 -- # return 0
00:04:10.551 05:03:41 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:10.551 05:03:41 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:10.551 05:03:41 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:10.551 05:03:41 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:10.551 05:03:41 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:10.551 05:03:41 -- setup/common.sh@18 -- # local node=1
00:04:10.551 05:03:41 -- setup/common.sh@19 -- # local var val
00:04:10.551 05:03:41 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.551 05:03:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.551 05:03:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:10.551 05:03:41 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:10.551 05:03:41 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.551 05:03:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.551 05:03:41 -- setup/common.sh@31 -- # IFS=': '
00:04:10.551 05:03:41 -- setup/common.sh@31 -- # read -r var val _
00:04:10.551 05:03:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 17488224 kB' 'MemUsed: 10214924 kB' 'SwapCached: 1600 kB' 'Active: 2637884 kB' 'Inactive: 4335888 kB' 'Active(anon): 2389272 kB' 'Inactive(anon): 4277768 kB' 'Active(file): 248612 kB' 'Inactive(file): 58120 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6711376 kB' 'Mapped: 83736 kB' 'AnonPages: 262492 kB' 'Shmem: 6403044 kB' 'KernelStack: 9352 kB' 'PageTables: 3956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224388 kB' 'Slab: 577224 kB' 'SReclaimable: 224388 kB' 'SUnreclaim: 352836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[repetitive xtrace elided: setup/common.sh@32 tests each key of the node1 dump above against HugePages_Surp and issues 'continue' until the key matches]
00:04:10.552 05:03:41 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:10.552 05:03:41 -- setup/common.sh@33 -- # echo 0
00:04:10.552 05:03:41 -- setup/common.sh@33 -- # return 0
00:04:10.552 05:03:41 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
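At this point nodes_test holds the observed per-node totals (512 and 513, each plus a zero surplus/reserve adjustment) and nodes_sys holds what the test configured. The comparison that follows deliberately ignores which node got the odd extra page: counts are used as keys of two associative arrays and only the key sets are compared. A hedged restatement of that idiom, with values taken from this run (the shipped script compares the expanded key lists directly; sorting is made explicit here for determinism):

  declare -A sorted_t sorted_s
  nodes_test=(512 513)                  # observed per-node HugePages_Total
  nodes_sys=(513 512)                   # configured per-node counts
  for node in "${!nodes_test[@]}"; do
      sorted_t[${nodes_test[node]}]=1   # counts become set keys,
      sorted_s[${nodes_sys[node]}]=1    # so per-node ordering drops out
  done
  t=$(printf '%s\n' "${!sorted_t[@]}" | sort -n | xargs)
  s=$(printf '%s\n' "${!sorted_s[@]}" | sort -n | xargs)
  [[ $t == "$s" ]] && echo "odd_alloc OK: $t"   # both sides expand to: 512 513

This is why the '[[ 512 513 == 512 513 ]]' check just below passes even though node0 reports 512 where 513 was requested.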
00:04:10.552 05:03:41 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:10.552 05:03:41 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:10.552 05:03:41 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:10.552 05:03:41 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:04:10.552 node0=512 expecting 513
00:04:10.552 05:03:41 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:10.552 05:03:41 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:10.552 05:03:41 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:10.552 05:03:41 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:04:10.552 node1=513 expecting 512
00:04:10.552 05:03:41 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:10.552
00:04:10.552 real 0m3.756s
00:04:10.552 user 0m1.388s
00:04:10.552 sys 0m2.442s
00:04:10.552 05:03:41 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:10.552 05:03:41 -- common/autotest_common.sh@10 -- # set +x
00:04:10.552 ************************************
00:04:10.552 END TEST odd_alloc
00:04:10.552 ************************************
00:04:10.812 05:03:41 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:04:10.812 05:03:41 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:10.812 05:03:41 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:10.812 05:03:41 -- common/autotest_common.sh@10 -- # set +x
00:04:10.812 ************************************
00:04:10.812 START TEST custom_alloc
00:04:10.812 ************************************
00:04:10.812 05:03:41 -- common/autotest_common.sh@1104 -- # custom_alloc
00:04:10.812 05:03:41 -- setup/hugepages.sh@167 -- # local IFS=,
00:04:10.812 05:03:41 -- setup/hugepages.sh@169 -- # local node
00:04:10.812 05:03:41 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:04:10.812 05:03:41 -- setup/hugepages.sh@170 -- # local nodes_hp
00:04:10.812 05:03:41 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:04:10.812 05:03:41 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:04:10.812 05:03:41 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:10.812 05:03:41 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:10.812 05:03:41 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:10.812 05:03:41 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:10.812 05:03:41 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:10.812 05:03:41 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:10.812 05:03:41 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:10.812 05:03:41 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:10.812 05:03:41 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:10.812 05:03:41 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:10.812 05:03:41 -- setup/hugepages.sh@83 -- # : 256
00:04:10.812 05:03:41 -- setup/hugepages.sh@84 -- # : 1
00:04:10.812 05:03:41 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:10.812 05:03:41 -- setup/hugepages.sh@83 -- # : 0
00:04:10.812 05:03:41 -- setup/hugepages.sh@84 -- # : 0
00:04:10.812 05:03:41 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:04:10.812 05:03:41 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:04:10.812 05:03:41 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:10.812 05:03:41 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:10.812 05:03:41 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:10.812 05:03:41 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:10.812 05:03:41 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:10.812 05:03:41 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:10.812 05:03:41 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:10.812 05:03:41 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:10.812 05:03:41 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:10.812 05:03:41 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:10.812 05:03:41 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:10.812 05:03:41 -- setup/hugepages.sh@78 -- # return 0
00:04:10.812 05:03:41 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:04:10.812 05:03:41 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:10.812 05:03:41 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:10.812 05:03:41 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:10.812 05:03:41 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:10.812 05:03:41 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:04:10.812 05:03:41 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:10.812 05:03:41 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:10.812 05:03:41 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:10.812 05:03:41 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:10.812 05:03:41 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:10.812 05:03:41 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:10.812 05:03:41 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:04:10.812 05:03:41 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:10.812 05:03:41 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:10.812 05:03:41 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:10.812 05:03:41 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:04:10.812 05:03:41 -- setup/hugepages.sh@78 -- # return 0
00:04:10.812 05:03:41 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:04:10.812 05:03:41 -- setup/hugepages.sh@187 -- # setup output
00:04:10.812 05:03:41 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:10.812 05:03:41 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:14.117 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:14.117 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:14.118 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:14.118 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:14.118 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:14.118 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:14.118 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:14.118 05:03:45 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
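custom_alloc hands setup.sh an explicit per-node split, HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024', i.e. 1536 pages of 2048 kB in total. Mechanically this boils down to the kernel's per-node sysfs knobs; a sketch of the equivalent manual steps (these are the standard kernel sysfs paths, not lines copied from setup.sh, and they require root):

  # request 512 pages on node 0 and 1024 on node 1
  echo 512  > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
  echo 1024 > /sys/devices/system/node/node1/hugepages/hugepages-2048kB/nr_hugepages
  # cross-check: per-node counters and the global total
  cat /sys/devices/system/node/node*/hugepages/hugepages-2048kB/nr_hugepages
  grep HugePages_Total /proc/meminfo    # expect: HugePages_Total:    1536

The verify_nr_hugepages pass that follows reads these counters back through get_meminfo.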
00:04:14.379 05:03:45 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:04:14.379 05:03:45 -- setup/hugepages.sh@89 -- # local node
00:04:14.379 05:03:45 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:14.379 05:03:45 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:14.379 05:03:45 -- setup/hugepages.sh@92 -- # local surp
00:04:14.379 05:03:45 -- setup/hugepages.sh@93 -- # local resv
00:04:14.379 05:03:45 -- setup/hugepages.sh@94 -- # local anon
00:04:14.379 05:03:45 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:14.379 05:03:45 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:14.379 05:03:45 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:14.379 05:03:45 -- setup/common.sh@18 -- # local node=
00:04:14.379 05:03:45 -- setup/common.sh@19 -- # local var val
00:04:14.379 05:03:45 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.379 05:03:45 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.379 05:03:45 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.379 05:03:45 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.379 05:03:45 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.379 05:03:45 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.379 05:03:45 -- setup/common.sh@31 -- # IFS=': '
00:04:14.379 05:03:45 -- setup/common.sh@31 -- # read -r var val _
00:04:14.379 05:03:45 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 39678976 kB' 'MemAvailable: 40309988 kB' 'Buffers: 4304 kB' 'Cached: 12146956 kB' 'SwapCached: 1968 kB' 'Active: 8017276 kB' 'Inactive: 4694792 kB' 'Active(anon): 7505688 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562092 kB' 'Mapped: 197968 kB' 'Shmem: 11223620 kB' 'KReclaimable: 572252 kB' 'Slab: 1239428 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667176 kB' 'KernelStack: 22144 kB' 'PageTables: 8516 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 14286264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216676 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
[repetitive xtrace elided: setup/common.sh@32 tests each key of the dump above against AnonHugePages and issues 'continue' until the key matches]
00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:14.380 05:03:45 -- setup/common.sh@33 -- # echo 0
00:04:14.380 05:03:45 -- setup/common.sh@33 -- # return 0
00:04:14.380 05:03:45 -- setup/hugepages.sh@97 -- # anon=0
00:04:14.380 05:03:45 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:14.380 05:03:45 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:14.380 05:03:45 -- setup/common.sh@18 -- # local node=
00:04:14.380 05:03:45 -- setup/common.sh@19 -- # local var val
00:04:14.380 05:03:45 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.380 05:03:45 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.380 05:03:45 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.380 05:03:45 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.380 05:03:45 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.380 05:03:45 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': '
00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _
00:04:14.380 05:03:45 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 39679984 kB' 'MemAvailable: 40310996 kB' 'Buffers: 4304 kB' 'Cached: 12146956 kB' 'SwapCached: 1968 kB' 'Active: 8016996 kB' 'Inactive: 4694792 kB' 'Active(anon): 7505408 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561820 kB' 'Mapped: 197928 kB' 'Shmem: 11223620 kB' 'KReclaimable: 572252 kB' 'Slab: 1239428 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667176 kB' 'KernelStack: 22128 kB' 'PageTables: 8460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 14286276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216644 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
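Both dumps above already show the expected state (HugePages_Total: 1536, HugePages_Free: 1536, Rsvd and Surp 0, and Hugetlb = 1536 * 2048 kB = 3145728 kB). The check that the elided scans feed is a small piece of arithmetic; restated with the get_meminfo sketch from earlier (the surrounding hugepages.sh plumbing is paraphrased, not quoted):

  nr_hugepages=1536                     # requested via HUGENODE above
  anon=$(get_meminfo AnonHugePages)     # THP amount in kB, read as in the trace;
                                        # the script handles it separately (0 here)
  surp=$(get_meminfo HugePages_Surp)    # pages allocated beyond the configured count
  resv=$(get_meminfo HugePages_Rsvd)    # reserved for mappings but not yet faulted
  total=$(get_meminfo HugePages_Total)
  (( total == nr_hugepages + surp + resv )) \
      || echo "hugepage accounting mismatch: total=$total surp=$surp resv=$resv"

The same identity appeared earlier in odd_alloc as '(( 1025 == nr_hugepages + surp + resv ))'.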
read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.380 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.380 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 
00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 
05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.381 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.381 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.382 05:03:45 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.382 05:03:45 -- setup/common.sh@33 -- # echo 0 00:04:14.382 05:03:45 -- setup/common.sh@33 -- # return 0 00:04:14.382 05:03:45 -- setup/hugepages.sh@99 -- # surp=0 00:04:14.382 05:03:45 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:14.382 05:03:45 -- 
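A short recap of what this trace is doing: setup/common.sh's get_meminfo walks one meminfo snapshot per call, and the backslash-escaped right-hand side in each [[ ... ]] line is simply how bash xtrace prints a quoted, literal (non-glob) comparison string. A minimal sketch of the parser the trace implies, reconstructed here rather than quoted from the SPDK source, so details may differ:

    #!/usr/bin/env bash
    # Sketch of the get_meminfo pattern traced above (reconstructed, not verbatim).
    shopt -s extglob   # for the +([0-9]) pattern below

    get_meminfo() {
            local get=$1 node=$2 var val _
            local mem_f=/proc/meminfo mem
            # With a node argument, read that node's sysfs counters instead; with
            # node empty, node/node/meminfo does not exist and /proc/meminfo is kept.
            if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
                    mem_f=/sys/devices/system/node/node$node/meminfo
            fi
            mapfile -t mem < "$mem_f"
            mem=("${mem[@]#Node +([0-9]) }")   # sysfs lines carry a "Node N " prefix
            while IFS=': ' read -r var val _; do
                    [[ $var == "$get" ]] || continue   # the skip rounds seen above
                    echo "$val"
                    return 0
            done < <(printf '%s\n' "${mem[@]}")
    }

    get_meminfo HugePages_Surp   # prints 0 in this run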
00:04:14.382 05:03:45 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:14.382 05:03:45 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:14.382 05:03:45 -- setup/common.sh@18 -- # local node=
00:04:14.382 05:03:45 -- setup/common.sh@19 -- # local var val
00:04:14.382 05:03:45 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.382 05:03:45 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.382 05:03:45 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.382 05:03:45 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.382 05:03:45 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.382 05:03:45 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.382 05:03:45 -- setup/common.sh@31 -- # IFS=': '
00:04:14.382 05:03:45 -- setup/common.sh@31 -- # read -r var val _
00:04:14.382 05:03:45 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 39681000 kB' 'MemAvailable: 40312012 kB' 'Buffers: 4304 kB' 'Cached: 12146960 kB' 'SwapCached: 1968 kB' 'Active: 8016660 kB' 'Inactive: 4694792 kB' 'Active(anon): 7505072 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561476 kB' 'Mapped: 197928 kB' 'Shmem: 11223624 kB' 'KReclaimable: 572252 kB' 'Slab: 1239436 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667184 kB' 'KernelStack: 22128 kB' 'PageTables: 8472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 14286292 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216644 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:14.382 05:03:45 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:14.382 05:03:45 -- setup/common.sh@32 -- # continue
[... one identical "[[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]" / continue / IFS=': ' / read -r var val _ skip round per remaining meminfo field elided, up to the matching key ...]
00:04:14.383 05:03:45 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:14.383 05:03:45 -- setup/common.sh@33 -- # echo 0
00:04:14.383 05:03:45 -- setup/common.sh@33 -- # return 0
00:04:14.383 05:03:45 -- setup/hugepages.sh@100 -- # resv=0
00:04:14.383 05:03:45 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:04:14.383 nr_hugepages=1536
00:04:14.383 05:03:45 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:14.383 resv_hugepages=0
00:04:14.383 05:03:45 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:14.383 surplus_hugepages=0
00:04:14.383 05:03:45 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:14.383 anon_hugepages=0
00:04:14.383 05:03:45 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:14.383 05:03:45 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
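With anon=0, surp=0, and resv=0 collected, hugepages.sh@107-110 verifies that the 1536 pages this test requested are fully accounted for. A standalone sketch of that arithmetic, assuming the get_meminfo sketch above (the 1536 literal is this run's request, not a fixed constant):

    # Sketch of the hugepages.sh@107-110 consistency checks; assumes the
    # get_meminfo helper sketched earlier. Values in comments match this run.
    requested=1536
    surp=$(get_meminfo HugePages_Surp)            # 0 in this run
    resv=$(get_meminfo HugePages_Rsvd)            # 0 in this run
    nr_hugepages=$(get_meminfo HugePages_Total)   # 1536 in this run
    (( requested == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2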
00:04:14.383 05:03:45 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:14.383 05:03:45 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:14.383 05:03:45 -- setup/common.sh@18 -- # local node=
00:04:14.383 05:03:45 -- setup/common.sh@19 -- # local var val
00:04:14.383 05:03:45 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.383 05:03:45 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.383 05:03:45 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.383 05:03:45 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.383 05:03:45 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.383 05:03:45 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.383 05:03:45 -- setup/common.sh@31 -- # IFS=': '
00:04:14.383 05:03:45 -- setup/common.sh@31 -- # read -r var val _
00:04:14.383 05:03:45 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 39681364 kB' 'MemAvailable: 40312376 kB' 'Buffers: 4304 kB' 'Cached: 12147000 kB' 'SwapCached: 1968 kB' 'Active: 8017168 kB' 'Inactive: 4694792 kB' 'Active(anon): 7505580 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561892 kB' 'Mapped: 197928 kB' 'Shmem: 11223664 kB' 'KReclaimable: 572252 kB' 'Slab: 1239436 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667184 kB' 'KernelStack: 22160 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 14286676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216692 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:14.383 05:03:45 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:14.383 05:03:45 -- setup/common.sh@32 -- # continue
[... one identical "[[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]" / continue / IFS=': ' / read -r var val _ skip round per remaining meminfo field elided, up to the matching key ...]
00:04:14.385 05:03:45 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:14.385 05:03:45 -- setup/common.sh@33 -- # echo 1536
00:04:14.385 05:03:45 -- setup/common.sh@33 -- # return 0
00:04:14.385 05:03:45 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:14.385 05:03:45 -- setup/hugepages.sh@112 -- # get_nodes
00:04:14.385 05:03:45 -- setup/hugepages.sh@27 -- # local node
00:04:14.385 05:03:45 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:14.385 05:03:45 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:14.385 05:03:45 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:14.385 05:03:45 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:14.385 05:03:45 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:14.385 05:03:45 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:14.385 05:03:45 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:14.385 05:03:45 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:14.385 05:03:45 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:14.385 05:03:45 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:14.385 05:03:45 -- setup/common.sh@18 -- # local node=0
00:04:14.385 05:03:45 -- setup/common.sh@19 -- # local var val
00:04:14.385 05:03:45 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.385 05:03:45 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.385 05:03:45 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:14.385 05:03:45 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:14.385 05:03:45 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.385 05:03:45 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.385 05:03:45 -- setup/common.sh@31 -- # IFS=': '
00:04:14.385 05:03:45 -- setup/common.sh@31 -- # read -r var val _
00:04:14.385 05:03:45 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 23224868 kB' 'MemUsed: 9367216 kB' 'SwapCached: 368 kB' 'Active: 5378600 kB' 'Inactive: 358904 kB' 'Active(anon): 5115624 kB' 'Inactive(anon): 972 kB' 'Active(file): 262976 kB' 'Inactive(file): 357932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5441804 kB' 'Mapped: 114188 kB' 'AnonPages: 298796 kB' 'Shmem: 4820528 kB' 'KernelStack: 12792 kB' 'PageTables: 4640 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347864 kB' 'Slab: 662200 kB' 'SReclaimable: 347864 kB' 'SUnreclaim: 314336 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:14.385 05:03:45 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:14.385 05:03:45 -- setup/common.sh@32 -- # continue
[... one identical "[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]" / continue / IFS=': ' / read -r var val _ skip round per remaining node-0 field, MemFree through HugePages_Free, elided ...]
00:04:14.386 05:03:45 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:14.386 05:03:45 -- setup/common.sh@33 -- # echo 0
00:04:14.386 05:03:45 -- setup/common.sh@33 -- # return 0
00:04:14.386 05:03:45 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
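get_nodes found two NUMA nodes (512 pages expected on node0, 1024 on node1), and hugepages.sh@115-117 folds the reserved count plus each node's surplus into those expectations, re-running get_meminfo against the per-node sysfs meminfo. A sketch of that loop under the same assumptions as above (nodes_test is populated earlier in the test; the values shown match this run):

    # Sketch of the hugepages.sh@115-117 per-node bookkeeping; assumes the
    # get_meminfo helper sketched earlier. nodes_test is set up before this
    # point in the test (512 pages expected on node 0, 1024 on node 1).
    declare -a nodes_test=([0]=512 [1]=1024)
    resv=0   # from the HugePages_Rsvd lookup above
    for node in "${!nodes_test[@]}"; do
            (( nodes_test[node] += resv ))   # hugepages.sh@116
            # hugepages.sh@117: add this node's surplus pages, read from
            # /sys/devices/system/node/node$node/meminfo
            (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
    done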
"${!nodes_test[@]}" 00:04:14.386 05:03:45 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:14.386 05:03:45 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:14.386 05:03:45 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:14.386 05:03:45 -- setup/common.sh@18 -- # local node=1 00:04:14.386 05:03:45 -- setup/common.sh@19 -- # local var val 00:04:14.386 05:03:45 -- setup/common.sh@20 -- # local mem_f mem 00:04:14.386 05:03:45 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.386 05:03:45 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:14.386 05:03:45 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:14.386 05:03:45 -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.386 05:03:45 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.386 05:03:45 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16457800 kB' 'MemUsed: 11245348 kB' 'SwapCached: 1600 kB' 'Active: 2638984 kB' 'Inactive: 4335888 kB' 'Active(anon): 2390372 kB' 'Inactive(anon): 4277768 kB' 'Active(file): 248612 kB' 'Inactive(file): 58120 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6711484 kB' 'Mapped: 83740 kB' 'AnonPages: 263500 kB' 'Shmem: 6403152 kB' 'KernelStack: 9384 kB' 'PageTables: 4004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224388 kB' 'Slab: 577236 kB' 'SReclaimable: 224388 kB' 'SUnreclaim: 352848 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:14.386 05:03:45 -- setup/common.sh@32 -- # continue 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.386 05:03:45 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.386 05:03:45 
00:04:14.386 05:03:45 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:14.386 05:03:45 -- setup/common.sh@32 -- # continue
[per-key scan continues: MemFree through HugePages_Free, no match for HugePages_Surp]
00:04:14.387 05:03:45 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:14.387 05:03:45 -- setup/common.sh@33 -- # echo 0
00:04:14.387 05:03:45 -- setup/common.sh@33 -- # return 0
00:04:14.387 05:03:45 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:14.387 05:03:45 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:14.387 05:03:45 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:14.387 05:03:45 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:14.387 05:03:45 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:14.387 node0=512 expecting 512
00:04:14.387 05:03:45 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:14.387 05:03:45 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:14.387 05:03:45 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:14.387 05:03:45 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:14.387 node1=1024 expecting 1024
00:04:14.387 05:03:45 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:14.387 real 0m3.732s
00:04:14.387 user 0m1.364s
00:04:14.387 sys 0m2.436s
00:04:14.387 05:03:45 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:14.387 05:03:45 -- common/autotest_common.sh@10 -- # set +x
00:04:14.387 ************************************
00:04:14.387 END TEST custom_alloc
00:04:14.387 ************************************
00:04:14.387 05:03:45 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:14.387 05:03:45 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:14.387 05:03:45 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:14.387 05:03:45 -- common/autotest_common.sh@10 -- # set +x
00:04:14.387 ************************************
00:04:14.387 START TEST no_shrink_alloc
00:04:14.387 ************************************
00:04:14.387 05:03:45 -- common/autotest_common.sh@1104 -- # no_shrink_alloc
00:04:14.387 05:03:45 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:14.387 05:03:45 -- setup/hugepages.sh@49 -- # local size=2097152
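get_test_nr_hugepages turns the requested size into a page count. The trace shows size=2097152 and nr_hugepages=1024, which is consistent with the 2048 kB Hugepagesize reported in the meminfo snapshots further down; a minimal sketch of that arithmetic, assuming kB units throughout:

    # Minimal sketch of the sizing step at hugepages.sh@49-@57,
    # assuming kB units (an assumption consistent with the trace).
    size=2097152              # first argument to get_test_nr_hugepages
    default_hugepages=2048    # Hugepagesize from /proc/meminfo, in kB
    if (( size >= default_hugepages )); then
        nr_hugepages=$(( size / default_hugepages ))   # 2097152 / 2048 = 1024
    fi
    echo "nr_hugepages=$nr_hugepages"   # -> nr_hugepages=1024, matching the trace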
00:04:14.387 05:03:45 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:14.387 05:03:45 -- setup/hugepages.sh@51 -- # shift
00:04:14.387 05:03:45 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:14.387 05:03:45 -- setup/hugepages.sh@52 -- # local node_ids
00:04:14.387 05:03:45 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:14.387 05:03:45 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:14.387 05:03:45 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:14.387 05:03:45 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:14.387 05:03:45 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:14.387 05:03:45 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:14.387 05:03:45 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:14.387 05:03:45 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:14.387 05:03:45 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:14.387 05:03:45 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:14.387 05:03:45 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:14.387 05:03:45 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:14.387 05:03:45 -- setup/hugepages.sh@73 -- # return 0
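With an explicit node list, the per-node step hands the full count to every listed node, which is why nodes_test[_no_nodes]=1024 lands on node 0 here. A sketch of that assignment; the even-split branch for the no-nodes-given case is an assumption, since this trace never takes it:

    # Sketch of the per-node assignment seen at hugepages.sh@62-@73.
    user_nodes=("0")     # node ids handed to get_test_nr_hugepages_per_node
    _nr_hugepages=1024
    _no_nodes=2          # NUMA nodes present on this machine
    declare -a nodes_test

    if (( ${#user_nodes[@]} > 0 )); then
        # Traced path: every listed node gets the full count.
        for node in "${user_nodes[@]}"; do
            nodes_test[node]=$_nr_hugepages      # -> nodes_test[0]=1024
        done
    else
        # Assumed fallback: split evenly across all nodes (untraced here).
        for (( node = 0; node < _no_nodes; node++ )); do
            nodes_test[node]=$(( _nr_hugepages / _no_nodes ))
        done
    fi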
00:04:14.387 05:03:45 -- setup/hugepages.sh@198 -- # setup output
00:04:14.387 05:03:45 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:14.387 05:03:45 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:17.722 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:17.722 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:17.722 05:03:48 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:17.722 05:03:48 -- setup/hugepages.sh@89 -- # local node
00:04:17.722 05:03:48 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:17.722 05:03:48 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:17.722 05:03:48 -- setup/hugepages.sh@92 -- # local surp
00:04:17.722 05:03:48 -- setup/hugepages.sh@93 -- # local resv
00:04:17.722 05:03:48 -- setup/hugepages.sh@94 -- # local anon
00:04:17.722 05:03:48 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:17.722 05:03:48 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:17.722 05:03:48 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:17.722 05:03:48 -- setup/common.sh@18 -- # local node=
00:04:17.722 05:03:48 -- setup/common.sh@19 -- # local var val
00:04:17.722 05:03:48 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.722 05:03:48 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.722 05:03:48 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.722 05:03:48 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.722 05:03:48 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.722 05:03:48 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.722 05:03:48 -- setup/common.sh@31 -- # IFS=': '
00:04:17.722 05:03:48 -- setup/common.sh@31 -- # read -r var val _
00:04:17.722 05:03:48 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40758252 kB' 'MemAvailable: 41389264 kB' 'Buffers: 4304 kB' 'Cached: 12147076 kB' 'SwapCached: 1968 kB' 'Active: 8017752 kB' 'Inactive: 4694792 kB' 'Active(anon): 7506164 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562624 kB' 'Mapped: 197872 kB' 'Shmem: 11223740 kB' 'KReclaimable: 572252 kB' 'Slab: 1239044 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666792 kB' 'KernelStack: 22176 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14287140 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216692 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:17.722 05:03:48 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.722 05:03:48 -- setup/common.sh@32 -- # continue
[per-key scan continues: MemFree through HardwareCorrupted, no match for AnonHugePages]
00:04:17.723 05:03:48 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.723 05:03:48 -- setup/common.sh@33 -- # echo 0
00:04:17.723 05:03:48 -- setup/common.sh@33 -- # return 0
00:04:17.723 05:03:48 -- setup/hugepages.sh@97 -- # anon=0
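The scan above walked /proc/meminfo key by key to read AnonHugePages=0. For a quick manual spot check the same field can be pulled in one shot; awk is not what setup/common.sh uses, this is only an equivalent:

    # Hypothetical one-shot equivalent of the scan above (not the script's method):
    awk '$1 == "AnonHugePages:" { print $2 }' /proc/meminfo
    # Per-node meminfo prefixes each line with "Node <N>", shifting the fields:
    awk '$2 == "AnonHugePages:" { print $3 }' /sys/devices/system/node/node0/meminfo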
00:04:17.723 05:03:48 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:17.723 05:03:48 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:17.723 05:03:48 -- setup/common.sh@18 -- # local node=
00:04:17.723 05:03:48 -- setup/common.sh@19 -- # local var val
00:04:17.723 05:03:48 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.723 05:03:48 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.723 05:03:48 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.723 05:03:48 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.723 05:03:48 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.723 05:03:48 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.723 05:03:48 -- setup/common.sh@31 -- # IFS=': '
00:04:17.723 05:03:48 -- setup/common.sh@31 -- # read -r var val _
00:04:17.723 05:03:48 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40767332 kB' 'MemAvailable: 41398344 kB' 'Buffers: 4304 kB' 'Cached: 12147076 kB' 'SwapCached: 1968 kB' 'Active: 8017920 kB' 'Inactive: 4694792 kB' 'Active(anon): 7506332 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562664 kB' 'Mapped: 197932 kB' 'Shmem: 11223740 kB' 'KReclaimable: 572252 kB' 'Slab: 1239096 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666844 kB' 'KernelStack: 22176 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14287152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216660 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:17.723 05:03:48 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.723 05:03:48 -- setup/common.sh@32 -- # continue
[per-key scan continues: MemFree through HugePages_Rsvd, no match for HugePages_Surp]
00:04:17.725 05:03:48 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.725 05:03:48 -- setup/common.sh@33 -- # echo 0
00:04:17.725 05:03:48 -- setup/common.sh@33 -- # return 0
00:04:17.725 05:03:48 -- setup/hugepages.sh@99 -- # surp=0
00:04:17.725 05:03:48 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:17.725 05:03:48 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:17.725 05:03:48 -- setup/common.sh@18 -- # local node=
00:04:17.725 05:03:48 -- setup/common.sh@19 -- # local var val
00:04:17.725 05:03:48 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.725 05:03:48 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.725 05:03:48 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.725 05:03:48 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.725 05:03:48 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.725 05:03:48 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.725 05:03:48 -- setup/common.sh@31 -- # IFS=': '
00:04:17.725 05:03:48 -- setup/common.sh@31 -- # read -r var val _
00:04:17.725 05:03:48 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40767084 kB' 'MemAvailable: 41398096 kB' 'Buffers: 4304 kB' 'Cached: 12147076 kB' 'SwapCached: 1968 kB' 'Active: 8018508 kB' 'Inactive: 4694792 kB' 'Active(anon): 7506920 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563296 kB' 'Mapped: 197932 kB' 'Shmem: 11223740 kB' 'KReclaimable: 572252 kB' 'Slab: 1239096 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666844 kB' 'KernelStack: 22224 kB' 'PageTables: 8788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14290192 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216660 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:17.725 05:03:48 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:17.725 05:03:48 -- setup/common.sh@32 -- # continue
[per-key scan continues: MemFree through HugePages_Free, no match for HugePages_Rsvd]
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.726 05:03:48 -- setup/common.sh@32 -- # continue 00:04:17.726 05:03:48 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.726 05:03:48 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.726 05:03:48 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.726 05:03:48 -- setup/common.sh@33 -- # echo 0 00:04:17.726 05:03:48 -- setup/common.sh@33 -- # return 0 00:04:17.726 05:03:48 -- setup/hugepages.sh@100 -- # resv=0 00:04:17.726 05:03:48 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:17.726 nr_hugepages=1024 00:04:17.726 05:03:48 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:17.726 resv_hugepages=0 00:04:17.726 05:03:48 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:17.726 surplus_hugepages=0 00:04:17.726 05:03:48 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:17.726 anon_hugepages=0 00:04:17.726 05:03:48 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:17.726 05:03:48 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:17.726 05:03:48 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:17.726 05:03:48 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:17.726 05:03:48 -- setup/common.sh@18 -- # local node= 00:04:17.726 05:03:48 -- setup/common.sh@19 -- # local var val 00:04:17.726 05:03:48 -- setup/common.sh@20 -- # local mem_f mem 00:04:17.726 05:03:48 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.726 05:03:48 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.726 05:03:48 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.726 05:03:48 -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.726 05:03:48 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.726 05:03:48 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.726 05:03:48 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.726 05:03:48 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40767008 kB' 'MemAvailable: 41398020 kB' 'Buffers: 4304 kB' 'Cached: 12147104 kB' 'SwapCached: 1968 kB' 'Active: 8018432 kB' 'Inactive: 4694792 kB' 'Active(anon): 7506844 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563288 kB' 'Mapped: 197992 kB' 'Shmem: 11223768 kB' 'KReclaimable: 572252 kB' 'Slab: 1239092 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 666840 kB' 'KernelStack: 22192 kB' 'PageTables: 8692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14290572 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216644 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB' 00:04:17.726 05:03:48 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.726 05:03:48 -- setup/common.sh@32 -- # continue 00:04:17.726 05:03:48 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.726 05:03:48 -- setup/common.sh@31 -- # read -r var 
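The scans elided above all come from one small lookup helper in setup/common.sh whose xtrace is what floods this log. A minimal sketch of that loop as reconstructed from the trace alone (not copied from the SPDK sources, so details may differ):

  # Look up one field from /proc/meminfo, or from a node's own meminfo
  # file when a node number is given. Reconstructed from the xtrace.
  get_meminfo() {
      local get=$1 node=${2:-}
      local mem_f=/proc/meminfo line var val _
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      while read -r line; do
          # Per-node files prefix each entry with "Node <n> "; strip it,
          # mirroring the mem=("${mem[@]#Node +([0-9]) }") step in the trace.
          [[ $line =~ ^Node\ [0-9]+\ (.*) ]] && line=${BASH_REMATCH[1]}
          IFS=': ' read -r var val _ <<< "$line"
          # Each non-matching field produces one "continue" entry in the log.
          [[ $var == "$get" ]] || continue
          echo "$val"   # e.g. 0 for HugePages_Rsvd in this run
          return 0
      done < "$mem_f"
      return 1
  }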
00:04:17.728 05:03:48 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:17.728 05:03:48 -- setup/common.sh@33 -- # echo 1024
00:04:17.728 05:03:48 -- setup/common.sh@33 -- # return 0
00:04:17.728 05:03:48 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:17.728 05:03:48 -- setup/hugepages.sh@112 -- # get_nodes
00:04:17.728 05:03:48 -- setup/hugepages.sh@27 -- # local node
00:04:17.728 05:03:48 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:17.728 05:03:48 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:17.728 05:03:48 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:17.728 05:03:48 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:17.728 05:03:48 -- setup/hugepages.sh@32 -- # no_nodes=2
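The get_nodes trace just above walks /sys/devices/system/node/node<N> and records each node's hugepage count (1024 on node0, 0 on node1 in this run). A rough sketch, under the assumption that the per-node count comes from the node's nr_hugepages sysfs counter; the exact file the script reads is not visible in the trace:

  # Record per-node hugepage counts. The caller provides nodes_sys=()
  # (indexed by node number) and no_nodes.
  get_nodes() {
      local node nr
      no_nodes=0
      for node in /sys/devices/system/node/node[0-9]*; do
          # Assumed source: the node's 2 MB hugepage pool counter.
          nr=$node/hugepages/hugepages-2048kB/nr_hugepages
          [[ -e $nr ]] || continue
          nodes_sys[${node##*node}]=$(<"$nr")
          ((++no_nodes))
      done
      ((no_nodes > 0))   # the trace checks this before proceeding
  }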
00:04:17.728 05:03:48 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:17.728 05:03:48 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:17.728 05:03:48 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:17.728 05:03:48 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:17.728 05:03:48 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:17.728 05:03:48 -- setup/common.sh@18 -- # local node=0
00:04:17.728 05:03:48 -- setup/common.sh@19 -- # local var val
00:04:17.728 05:03:48 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.728 05:03:48 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.728 05:03:48 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:17.728 05:03:48 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:17.728 05:03:48 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.728 05:03:48 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.728 05:03:48 -- setup/common.sh@31 -- # IFS=': '
00:04:17.728 05:03:48 -- setup/common.sh@31 -- # read -r var val _
00:04:17.728 05:03:48 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 22207788 kB' 'MemUsed: 10384296 kB' 'SwapCached: 368 kB' 'Active: 5379100 kB' 'Inactive: 358904 kB' 'Active(anon): 5116124 kB' 'Inactive(anon): 972 kB' 'Active(file): 262976 kB' 'Inactive(file): 357932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5441812 kB' 'Mapped: 114188 kB' 'AnonPages: 299496 kB' 'Shmem: 4820536 kB' 'KernelStack: 12936 kB' 'PageTables: 4768 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347864 kB' 'Slab: 661804 kB' 'SReclaimable: 347864 kB' 'SUnreclaim: 313940 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace elided: field-by-field scan of node0's meminfo against HugePages_Surp]
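Both snapshots printed above go through the same lookup path: with no node argument the helper reads /proc/meminfo, and with node=0 it switches to node0's own meminfo file. Hypothetical calls matching this run, using the sketch shown earlier:

  get_meminfo HugePages_Total     # prints 1024, read from /proc/meminfo
  get_meminfo HugePages_Surp 0    # prints 0, read from /sys/devices/system/node/node0/meminfo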
00:04:17.729 05:03:48 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.729 05:03:48 -- setup/common.sh@33 -- # echo 0
00:04:17.729 05:03:48 -- setup/common.sh@33 -- # return 0
00:04:17.729 05:03:48 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:17.729 05:03:48 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:17.729 05:03:48 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:17.729 05:03:48 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:17.729 05:03:48 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:17.729 node0=1024 expecting 1024
00:04:17.729 05:03:48 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:17.729 05:03:48 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:17.729 05:03:48 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:17.729 05:03:48 -- setup/hugepages.sh@202 -- # setup output
00:04:17.729 05:03:48 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:17.729 05:03:48 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
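The CLEAR_HUGE=no NRHUGE=512 environment handed to scripts/setup.sh above asks for 512 hugepages without first releasing the existing pool, so the 1024 pages already on node0 are left in place, which is exactly what the INFO line below reports. A hypothetical stand-alone invocation with the same knobs:

  # Re-run SPDK device/hugepage setup, keeping the current hugepage pool.
  sudo CLEAR_HUGE=no NRHUGE=512 ./scripts/setup.sh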
00:04:21.024 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:21.024 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:21.287 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:21.287 05:03:52 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:21.287 05:03:52 -- setup/hugepages.sh@89 -- # local node
00:04:21.287 05:03:52 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:21.287 05:03:52 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:21.287 05:03:52 -- setup/hugepages.sh@92 -- # local surp
00:04:21.287 05:03:52 -- setup/hugepages.sh@93 -- # local resv
00:04:21.287 05:03:52 -- setup/hugepages.sh@94 -- # local anon
00:04:21.287 05:03:52 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:21.287 05:03:52 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:21.287 05:03:52 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:21.287 05:03:52 -- setup/common.sh@18 -- # local node=
00:04:21.287 05:03:52 -- setup/common.sh@19 -- # local var val
00:04:21.287 05:03:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.287 05:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.287 05:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.287 05:03:52 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.287 05:03:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.287 05:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.287 05:03:52 -- setup/common.sh@31 -- # IFS=': '
00:04:21.287 05:03:52 -- setup/common.sh@31 -- # read -r var val _
00:04:21.287 05:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40738580 kB' 'MemAvailable: 41369592 kB' 'Buffers: 4304 kB' 'Cached: 12147196 kB' 'SwapCached: 1968 kB' 'Active: 8020164 kB' 'Inactive: 4694792 kB' 'Active(anon): 7508576 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564316 kB' 'Mapped: 198024 kB' 'Shmem: 11223860 kB' 'KReclaimable: 572252 kB' 'Slab: 1239552 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667300 kB' 'KernelStack: 22192 kB' 'PageTables: 8596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14287416 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216772 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
[xtrace elided: field-by-field scan against AnonHugePages]
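The hugepages.sh@96 entry above checks the kernel's transparent-hugepage knob, which reads "always [madvise] never" on this host, before counting AnonHugePages. A minimal sketch of that gate, reconstructed from the trace (the sysfs path is the kernel's standard location; everything else is inferred):

  # Count anonymous hugepages only when THP is not set to "never".
  thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)  # "always [madvise] never" here
  if [[ $thp != *"[never]"* ]]; then
      anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
  else
      anon=0
  fi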
00:04:21.288 05:03:52 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:21.288 05:03:52 -- setup/common.sh@33 -- # echo 0
00:04:21.288 05:03:52 -- setup/common.sh@33 -- # return 0
00:04:21.288 05:03:52 -- setup/hugepages.sh@97 -- # anon=0
00:04:21.288 05:03:52 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:21.288 05:03:52 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:21.288 05:03:52 -- setup/common.sh@18 -- # local node=
00:04:21.288 05:03:52 -- setup/common.sh@19 -- # local var val
00:04:21.288 05:03:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.288 05:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.288 05:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.288 05:03:52 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.288 05:03:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.288 05:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.288 05:03:52 -- setup/common.sh@31 -- # IFS=': '
00:04:21.288 05:03:52 -- setup/common.sh@31 -- # read -r var val _
00:04:21.288 05:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40739356 kB' 'MemAvailable: 41370368 kB' 'Buffers: 4304 kB' 'Cached: 12147200 kB' 'SwapCached: 1968 kB' 'Active: 8019380 kB' 'Inactive: 4694792 kB' 'Active(anon): 7507792 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563512 kB' 'Mapped: 198028 kB' 'Shmem: 11223864 kB' 'KReclaimable: 572252 kB' 'Slab: 1239576 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667324 kB' 'KernelStack: 22160 kB' 'PageTables: 8528 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14287564 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216708 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
[xtrace elided: field-by-field scan against HugePages_Surp]
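Taken together, these lookups implement one accounting identity: HugePages_Total must equal the requested page count plus surplus plus reserved pages, globally and per node (hence the earlier "node0=1024 expecting 1024" line). A worked sketch with this run's values; this is a reconstruction, not the verbatim verify_nr_hugepages body:

  nr_hugepages=1024                      # requested by the test
  surp=$(get_meminfo HugePages_Surp)     # 0
  resv=$(get_meminfo HugePages_Rsvd)     # 0
  total=$(get_meminfo HugePages_Total)   # 1024
  # 1024 == 1024 + 0 + 0, so the pool is exactly as configured.
  ((total == nr_hugepages + surp + resv)) || echo "hugepage accounting mismatch" >&2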
IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # continue 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # continue 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # continue 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # continue 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # continue 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # continue 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # continue 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # continue 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # continue 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:21.289 05:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:21.289 05:03:52 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.289 05:03:52 -- setup/common.sh@33 -- # echo 0 00:04:21.290 05:03:52 -- setup/common.sh@33 -- # return 0 00:04:21.290 05:03:52 -- setup/hugepages.sh@99 -- # surp=0 00:04:21.290 05:03:52 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:21.290 05:03:52 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:21.290 05:03:52 -- setup/common.sh@18 -- # local node= 00:04:21.290 05:03:52 -- setup/common.sh@19 -- # local var val 00:04:21.290 05:03:52 -- setup/common.sh@20 -- # local mem_f mem 00:04:21.290 05:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.290 05:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.290 05:03:52 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.290 05:03:52 -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.290 05:03:52 
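For readability: the compare-and-continue xtrace above is setup/common.sh's get_meminfo scanning /proc/meminfo one field at a time until it reaches the requested key (here HugePages_Surp, which is 0). A minimal stand-alone sketch of that lookup pattern, simplified from the trace rather than copied from the SPDK source:

  #!/usr/bin/env bash
  # Look up one /proc/meminfo field by name and print its value (kB for most
  # fields, a bare page count for the HugePages_* fields).
  get_meminfo() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] && { echo "$val"; return 0; }  # match -> value
          continue  # every non-match is one "continue" line in the xtrace
      done < /proc/meminfo
      return 1
  }
  surp=$(get_meminfo HugePages_Surp)   # 0 on this build host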
00:04:21.290 05:03:52 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:21.290 05:03:52 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:21.290 05:03:52 -- setup/common.sh@18 -- # local node=
00:04:21.290 05:03:52 -- setup/common.sh@19 -- # local var val
00:04:21.290 05:03:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.290 05:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.290 05:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.290 05:03:52 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.290 05:03:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.290 05:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.290 05:03:52 -- setup/common.sh@31 -- # IFS=': '
00:04:21.290 05:03:52 -- setup/common.sh@31 -- # read -r var val _
00:04:21.290 05:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40739492 kB' 'MemAvailable: 41370504 kB' 'Buffers: 4304 kB' 'Cached: 12147216 kB' 'SwapCached: 1968 kB' 'Active: 8018900 kB' 'Inactive: 4694792 kB' 'Active(anon): 7507312 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563528 kB' 'Mapped: 197952 kB' 'Shmem: 11223880 kB' 'KReclaimable: 572252 kB' 'Slab: 1239580 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667328 kB' 'KernelStack: 22176 kB' 'PageTables: 8572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14287584 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216708 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:21.290 05:03:52 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:21.290 05:03:52 -- setup/common.sh@32 -- # continue
[... xtrace elided: the same compare-and-continue cycle repeats for every field until HugePages_Rsvd matches ...]
00:04:21.291 05:03:52 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:21.291 05:03:52 -- setup/common.sh@33 -- # echo 0
00:04:21.291 05:03:52 -- setup/common.sh@33 -- # return 0
00:04:21.291 05:03:52 -- setup/hugepages.sh@100 -- # resv=0
00:04:21.291 05:03:52 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:21.291 nr_hugepages=1024
00:04:21.291 05:03:52 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:21.291 resv_hugepages=0
00:04:21.291 05:03:52 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:21.291 surplus_hugepages=0
00:04:21.291 05:03:52 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:21.291 anon_hugepages=0
00:04:21.291 05:03:52 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:21.291 05:03:52 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
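The lookups feed hugepages.sh's consistency check at @107: the pool this test configured (1024 pages) must equal nr_hugepages plus the surplus and reserved counts just read back. A rough stand-alone equivalent of that assertion (awk-based here for brevity; the test itself goes through the get_meminfo helper sketched above):

  #!/usr/bin/env bash
  expected=1024   # pool size configured by the test
  nr=$(awk   '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
  surp=$(awk '$1 == "HugePages_Surp:"  {print $2}' /proc/meminfo)
  resv=$(awk '$1 == "HugePages_Rsvd:"  {print $2}' /proc/meminfo)
  # Mirrors: (( 1024 == nr_hugepages + surp + resv )) && (( 1024 == nr_hugepages ))
  if (( expected == nr + surp + resv )) && (( expected == nr )); then
      echo "nr_hugepages=$nr resv_hugepages=$resv surplus_hugepages=$surp"
  else
      echo "hugepage accounting mismatch" >&2
      exit 1
  fi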
00:04:21.291 05:03:52 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:21.291 05:03:52 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:21.291 05:03:52 -- setup/common.sh@18 -- # local node=
00:04:21.291 05:03:52 -- setup/common.sh@19 -- # local var val
00:04:21.291 05:03:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.291 05:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.291 05:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:21.291 05:03:52 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:21.291 05:03:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.291 05:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.291 05:03:52 -- setup/common.sh@31 -- # IFS=': '
00:04:21.291 05:03:52 -- setup/common.sh@31 -- # read -r var val _
00:04:21.291 05:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40740404 kB' 'MemAvailable: 41371416 kB' 'Buffers: 4304 kB' 'Cached: 12147248 kB' 'SwapCached: 1968 kB' 'Active: 8018880 kB' 'Inactive: 4694792 kB' 'Active(anon): 7507292 kB' 'Inactive(anon): 4278740 kB' 'Active(file): 511588 kB' 'Inactive(file): 416052 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7321340 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563460 kB' 'Mapped: 197952 kB' 'Shmem: 11223912 kB' 'KReclaimable: 572252 kB' 'Slab: 1239580 kB' 'SReclaimable: 572252 kB' 'SUnreclaim: 667328 kB' 'KernelStack: 22176 kB' 'PageTables: 8616 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 14300036 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216708 kB' 'VmallocChunk: 0 kB' 'Percpu: 112896 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3218804 kB' 'DirectMap2M: 52041728 kB' 'DirectMap1G: 13631488 kB'
00:04:21.291 05:03:52 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:21.291 05:03:52 -- setup/common.sh@32 -- # continue
[... xtrace elided: the same compare-and-continue cycle repeats for every field until HugePages_Total matches ...]
00:04:21.293 05:03:52 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:21.293 05:03:52 -- setup/common.sh@33 -- # echo 1024
00:04:21.293 05:03:52 -- setup/common.sh@33 -- # return 0
00:04:21.293 05:03:52 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:21.293 05:03:52 -- setup/hugepages.sh@112 -- # get_nodes
00:04:21.293 05:03:52 -- setup/hugepages.sh@27 -- # local node
00:04:21.293 05:03:52 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:21.293 05:03:52 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:21.293 05:03:52 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:21.293 05:03:52 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:21.293 05:03:52 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:21.293 05:03:52 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
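The same helper is now pointed at NUMA node 0: because a node argument is given, mem_f switches from /proc/meminfo to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0" prefix that the mem=("${mem[@]#Node +([0-9]) }") expansion strips. A stand-alone sketch of the per-node variant (an assumed simplification, not the verbatim helper):

  #!/usr/bin/env bash
  shopt -s extglob   # needed for the +([0-9]) pattern below
  # Per-NUMA-node meminfo lookup; sysfs lines look like "Node 0 MemTotal: ... kB".
  get_node_meminfo() {
      local node=$1 get=$2 line var val _
      while read -r line; do
          line=${line#Node +([0-9]) }   # drop the "Node <n> " prefix
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done < "/sys/devices/system/node/node${node}/meminfo"
      return 1
  }
  get_node_meminfo 0 HugePages_Surp   # 0 here: node0 holds all 1024 pages, none surplus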
00:04:21.293 05:03:52 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:21.293 05:03:52 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:21.293 05:03:52 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:21.293 05:03:52 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:21.293 05:03:52 -- setup/common.sh@18 -- # local node=0
00:04:21.293 05:03:52 -- setup/common.sh@19 -- # local var val
00:04:21.293 05:03:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:21.293 05:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:21.293 05:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:21.293 05:03:52 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:21.293 05:03:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:21.293 05:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:21.293 05:03:52 -- setup/common.sh@31 -- # IFS=': '
00:04:21.293 05:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 22199432 kB' 'MemUsed: 10392652 kB' 'SwapCached: 368 kB' 'Active: 5379760 kB' 'Inactive: 358904 kB' 'Active(anon): 5116784 kB' 'Inactive(anon): 972 kB' 'Active(file): 262976 kB' 'Inactive(file): 357932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5441864 kB' 'Mapped: 114188 kB' 'AnonPages: 299992 kB' 'Shmem: 4820588 kB' 'KernelStack: 12824 kB' 'PageTables: 4648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 347864 kB' 'Slab: 661924 kB' 'SReclaimable: 347864 kB' 'SUnreclaim: 314060 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:21.293 05:03:52 -- setup/common.sh@31 -- # read -r var val _
00:04:21.293 05:03:52 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.293 05:03:52 -- setup/common.sh@32 -- # continue
[... xtrace elided: the same compare-and-continue cycle repeats for every node0 field until HugePages_Surp matches ...]
00:04:21.294 05:03:52 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:21.294 05:03:52 -- setup/common.sh@33 -- # echo 0
00:04:21.294 05:03:52 -- setup/common.sh@33 -- # return 0
00:04:21.294 05:03:52 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:21.294 05:03:52 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:21.294 05:03:52 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:21.294 05:03:52 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:21.294 05:03:52 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:21.294 node0=1024 expecting 1024
00:04:21.294 05:03:52 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:21.294
00:04:21.294 real	0m6.906s
00:04:21.294 user	0m2.472s
00:04:21.294 sys	0m4.478s
00:04:21.294 05:03:52 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:21.294 05:03:52 -- common/autotest_common.sh@10 -- # set +x
00:04:21.294 ************************************
00:04:21.294 END TEST no_shrink_alloc
00:04:21.294 ************************************
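The START/END banners and real/user/sys lines throughout this log come from the run_test wrapper in autotest_common.sh, which (as the @1077/@1083/@1104/@1105 trace lines suggest) names a test, times it, and brackets it with banners. A hedged sketch of that shape, assuming this structure from the trace rather than quoting the real wrapper:

  #!/usr/bin/env bash
  # Banner-and-timing harness in the style of run_test; details are assumed.
  run_test() {
      local name=$1; shift
      echo "************************************"
      echo "START TEST $name"
      echo "************************************"
      time "$@"                 # produces the real/user/sys lines seen above
      local rc=$?
      echo "************************************"
      echo "END TEST $name"
      echo "************************************"
      return $rc
  }
  run_test no_shrink_alloc true   # placeholder command for illustration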
00:04:21.554 05:03:52 -- setup/hugepages.sh@217 -- # clear_hp
00:04:21.554 05:03:52 -- setup/hugepages.sh@37 -- # local node hp
00:04:21.554 05:03:52 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:21.554 05:03:52 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:21.554 05:03:52 -- setup/hugepages.sh@41 -- # echo 0
[... xtrace elided: the same for-hp / echo-0 trace repeats for each hugepage size directory on both NUMA nodes ...]
00:04:21.554 05:03:52 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:21.554 05:03:52 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:21.554
00:04:21.554 real	0m27.687s
00:04:21.554 user	0m9.571s
00:04:21.554 sys	0m16.966s
00:04:21.554 05:03:52 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:21.554 05:03:52 -- common/autotest_common.sh@10 -- # set +x
00:04:21.554 ************************************
00:04:21.554 END TEST hugepages
00:04:21.554 ************************************
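clear_hp, traced just above, walks every hugepage-size directory on every NUMA node and writes 0 to its page count, returning the pool to the kernel. A minimal stand-alone version of that cleanup (nr_hugepages is the standard sysfs knob; the loop shape follows the @39/@40/@41 trace):

  #!/usr/bin/env bash
  # Release all reserved hugepages on every NUMA node (needs root).
  for node in /sys/devices/system/node/node[0-9]*; do
      for hp in "$node"/hugepages/hugepages-*; do
          echo 0 > "$hp/nr_hugepages"   # each write is one "echo 0" in the xtrace
      done
  done
  export CLEAR_HUGE=yes   # flag for later setup.sh runs that the pool was cleared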
00:04:21.554 05:03:52 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:21.554 05:03:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:21.554 05:03:52 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:21.554 05:03:52 -- common/autotest_common.sh@10 -- # set +x
00:04:21.554 ************************************
00:04:21.554 START TEST driver
00:04:21.554 ************************************
00:04:21.554 05:03:52 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:21.554 * Looking for test storage...
00:04:21.554 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:21.554 05:03:52 -- setup/driver.sh@68 -- # setup reset
00:04:21.554 05:03:52 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:21.554 05:03:52 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:26.828 05:03:57 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:04:26.828 05:03:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:26.828 05:03:57 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:26.828 05:03:57 -- common/autotest_common.sh@10 -- # set +x
00:04:26.828 ************************************
00:04:26.828 START TEST guess_driver
00:04:26.828 ************************************
00:04:26.828 05:03:57 -- common/autotest_common.sh@1104 -- # guess_driver
00:04:26.828 05:03:57 -- setup/driver.sh@46 -- # local driver setup_driver marker
00:04:26.828 05:03:57 -- setup/driver.sh@47 -- # local fail=0
00:04:26.828 05:03:57 -- setup/driver.sh@49 -- # pick_driver
00:04:26.828 05:03:57 -- setup/driver.sh@36 -- # vfio
00:04:26.828 05:03:57 -- setup/driver.sh@21 -- # local iommu_grups
00:04:26.828 05:03:57 -- setup/driver.sh@22 -- # local unsafe_vfio
00:04:26.828 05:03:57 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:04:26.828 05:03:57 -- setup/driver.sh@25 -- # unsafe_vfio=N
00:04:26.828 05:03:57 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:04:26.828 05:03:57 -- setup/driver.sh@29 -- # (( 176 > 0 ))
00:04:26.828 05:03:57 -- setup/driver.sh@30 -- # is_driver vfio_pci
00:04:26.828 05:03:57 -- setup/driver.sh@14 -- # mod vfio_pci
00:04:26.828 05:03:57 -- setup/driver.sh@12 -- # dep vfio_pci
00:04:26.828 05:03:57 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:04:26.828 05:03:57 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz
00:04:26.828 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:26.828 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:26.828 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:26.828 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:26.828 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz
00:04:26.828 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz
00:04:26.828 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:04:26.828 05:03:57 -- setup/driver.sh@30 -- # return 0
00:04:26.828 05:03:57 -- setup/driver.sh@37 -- # echo vfio-pci
00:04:26.828 05:03:57 -- setup/driver.sh@49 -- # driver=vfio-pci
00:04:26.828 05:03:57 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:04:26.828 05:03:57 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:04:26.828 Looking for driver=vfio-pci
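pick_driver settles on vfio-pci here because the host exposes IOMMU groups (the (( 176 > 0 )) check above) and modprobe resolves vfio_pci's full dependency chain to .ko objects. A condensed sketch of that decision; the uio_pci_generic fallback is named only as an assumption about what a host without an IOMMU would get:

  #!/usr/bin/env bash
  # Does modprobe resolve this module to real kernel objects? (the *\.\k\o* test)
  is_driver() { [[ $(modprobe --show-depends "$1" 2>/dev/null) == *.ko* ]]; }

  iommu_groups=(/sys/kernel/iommu_groups/*)
  if (( ${#iommu_groups[@]} > 0 )) && is_driver vfio_pci; then
      echo vfio-pci                  # IOMMU groups exist and vfio_pci is loadable
  elif is_driver uio_pci_generic; then
      echo uio_pci_generic           # assumed fallback when vfio is unavailable
  else
      echo 'No valid driver found'   # the string tested at driver.sh@51
  fi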
vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:00 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:00 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:00 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:00 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:00 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:00 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:00 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:00 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:00 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:01 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:01 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:01 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:01 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:01 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:01 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:01 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:01 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:01 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:01 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:01 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:01 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:01 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:01 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:01 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.117 05:04:01 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:30.117 05:04:01 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:30.117 05:04:01 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:31.496 05:04:02 -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:31.496 05:04:02 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:31.496 05:04:02 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:31.755 05:04:02 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:31.755 05:04:02 -- setup/driver.sh@65 -- # setup reset 00:04:31.755 05:04:02 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:31.755 05:04:02 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:37.038 00:04:37.038 real 0m10.068s 00:04:37.038 user 0m2.645s 00:04:37.038 sys 0m5.156s 00:04:37.038 05:04:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:37.038 05:04:07 -- common/autotest_common.sh@10 -- # set +x 00:04:37.038 ************************************ 00:04:37.038 END TEST guess_driver 00:04:37.038 ************************************ 00:04:37.038 00:04:37.038 real 0m15.107s 00:04:37.038 user 0m4.061s 00:04:37.038 sys 0m7.928s 00:04:37.038 05:04:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:37.038 05:04:07 -- common/autotest_common.sh@10 -- # set +x 00:04:37.038 ************************************ 00:04:37.038 END TEST driver 00:04:37.038 ************************************ 00:04:37.038 05:04:07 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:37.038 05:04:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:37.038 05:04:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:37.038 05:04:07 -- common/autotest_common.sh@10 -- # set +x 00:04:37.038 ************************************ 00:04:37.038 START TEST devices 00:04:37.038 ************************************ 00:04:37.038 05:04:07 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:37.038 * Looking for test storage... 
00:04:37.038 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:37.038 05:04:07 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:37.038 05:04:07 -- setup/devices.sh@192 -- # setup reset 00:04:37.038 05:04:07 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:37.038 05:04:07 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:40.327 05:04:11 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:40.327 05:04:11 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:04:40.327 05:04:11 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:04:40.328 05:04:11 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:04:40.328 05:04:11 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:04:40.328 05:04:11 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:04:40.328 05:04:11 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:04:40.328 05:04:11 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:40.328 05:04:11 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:04:40.328 05:04:11 -- setup/devices.sh@196 -- # blocks=() 00:04:40.328 05:04:11 -- setup/devices.sh@196 -- # declare -a blocks 00:04:40.328 05:04:11 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:40.328 05:04:11 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:40.328 05:04:11 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:40.328 05:04:11 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:40.328 05:04:11 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:40.328 05:04:11 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:40.328 05:04:11 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:40.328 05:04:11 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:40.328 05:04:11 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:40.328 05:04:11 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:40.328 05:04:11 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:40.587 No valid GPT data, bailing 00:04:40.587 05:04:11 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:40.587 05:04:11 -- scripts/common.sh@393 -- # pt= 00:04:40.587 05:04:11 -- scripts/common.sh@394 -- # return 1 00:04:40.587 05:04:11 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:40.587 05:04:11 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:40.587 05:04:11 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:40.587 05:04:11 -- setup/common.sh@80 -- # echo 1600321314816 00:04:40.587 05:04:11 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:40.587 05:04:11 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:40.587 05:04:11 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:40.587 05:04:11 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:40.587 05:04:11 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:40.587 05:04:11 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:40.587 05:04:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:40.587 05:04:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:40.587 05:04:11 -- common/autotest_common.sh@10 -- # set +x 00:04:40.587 ************************************ 00:04:40.587 START TEST nvme_mount 00:04:40.587 ************************************ 00:04:40.587 
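The nvme_mount run that starts here depends on the device scan traced just above: a namespace qualifies as the test disk only if it is not zoned, carries no partition table, and is at least min_disk_size (3221225472 bytes) large. A minimal sketch of that filter, assuming the sysfs paths and blkid probe shown in the trace (the loop structure itself is reconstructed and slightly simplified):

    min_disk_size=3221225472                         # 3 GiB floor, as devices.sh@198 sets it
    for block in /sys/block/nvme*; do
        dev=${block##*/}
        # zoned namespaces are skipped (queue/zoned must read "none")
        if [[ -e $block/queue/zoned && $(cat "$block/queue/zoned") != none ]]; then
            continue
        fi
        # a namespace that already carries a partition table counts as in use
        if [[ -n $(blkid -s PTTYPE -o value "/dev/$dev") ]]; then
            continue
        fi
        size=$(( $(cat "$block/size") * 512 ))       # the sysfs size file counts 512-byte sectors
        (( size >= min_disk_size )) && echo "eligible test disk: $dev ($size bytes)"
    done

In this run the probe prints "No valid GPT data, bailing" and a size of 1600321314816 bytes, so nvme0n1 at 0000:d8:00.0 passes the filter.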
05:04:11 -- common/autotest_common.sh@1104 -- # nvme_mount 00:04:40.587 05:04:11 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:40.587 05:04:11 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:40.587 05:04:11 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:40.587 05:04:11 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:40.587 05:04:11 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:40.587 05:04:11 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:40.587 05:04:11 -- setup/common.sh@40 -- # local part_no=1 00:04:40.587 05:04:11 -- setup/common.sh@41 -- # local size=1073741824 00:04:40.587 05:04:11 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:40.587 05:04:11 -- setup/common.sh@44 -- # parts=() 00:04:40.587 05:04:11 -- setup/common.sh@44 -- # local parts 00:04:40.587 05:04:11 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:40.587 05:04:11 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:40.587 05:04:11 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:40.587 05:04:11 -- setup/common.sh@46 -- # (( part++ )) 00:04:40.587 05:04:11 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:40.587 05:04:11 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:40.587 05:04:11 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:40.587 05:04:11 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:41.525 Creating new GPT entries in memory. 00:04:41.525 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:41.525 other utilities. 00:04:41.525 05:04:12 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:41.525 05:04:12 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:41.525 05:04:12 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:41.525 05:04:12 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:41.525 05:04:12 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:42.462 Creating new GPT entries in memory. 00:04:42.462 The operation has completed successfully. 
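The GPT messages above come from partition_drive: the requested size of 1073741824 bytes is converted to 512-byte sectors (1073741824 / 512 = 2097152), so the first partition spans sectors 2048 through 2048 + 2097152 - 1 = 2099199. A sketch of the two sgdisk calls the trace shows (the uevent watcher runs alongside and waits for the kernel to publish nvme0n1p1):

    size=$(( 1073741824 / 512 ))                     # 2097152 sectors per 1 GiB partition
    part_start=2048
    part_end=$(( part_start + size - 1 ))            # 2048 + 2097152 - 1 = 2099199
    sgdisk /dev/nvme0n1 --zap-all                    # drop any existing GPT and protective MBR
    flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:"$part_start":"$part_end"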
00:04:42.462 05:04:13 -- setup/common.sh@57 -- # (( part++ )) 00:04:42.462 05:04:13 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:42.462 05:04:13 -- setup/common.sh@62 -- # wait 3097647 00:04:42.721 05:04:13 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.721 05:04:13 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:42.721 05:04:13 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.721 05:04:13 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:42.721 05:04:13 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:42.721 05:04:13 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.721 05:04:13 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:42.721 05:04:13 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:42.721 05:04:13 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:42.721 05:04:13 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.721 05:04:13 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:42.721 05:04:13 -- setup/devices.sh@53 -- # local found=0 00:04:42.721 05:04:13 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:42.721 05:04:13 -- setup/devices.sh@56 -- # : 00:04:42.721 05:04:13 -- setup/devices.sh@59 -- # local pci status 00:04:42.721 05:04:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.721 05:04:13 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:42.721 05:04:13 -- setup/devices.sh@47 -- # setup output config 00:04:42.721 05:04:13 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:42.721 05:04:13 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:46.014 05:04:16 -- setup/devices.sh@63 -- # found=1 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.014 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.014 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.015 05:04:16 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.015 05:04:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.015 05:04:16 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:46.015 05:04:16 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:46.015 05:04:16 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.015 05:04:16 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:46.015 05:04:16 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.015 05:04:16 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:46.015 05:04:16 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.015 05:04:16 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.015 05:04:16 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:46.015 05:04:16 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:46.015 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:46.015 05:04:16 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:46.015 05:04:16 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:46.274 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:46.274 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:46.274 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:04:46.274 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:46.274 05:04:17 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:46.274 05:04:17 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:46.274 05:04:17 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.274 05:04:17 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:46.274 05:04:17 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:46.274 05:04:17 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.274 05:04:17 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.274 05:04:17 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:46.274 05:04:17 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:46.274 05:04:17 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.274 05:04:17 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.274 05:04:17 -- setup/devices.sh@53 -- # local found=0 00:04:46.274 05:04:17 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:46.274 05:04:17 -- setup/devices.sh@56 -- # : 00:04:46.274 05:04:17 -- setup/devices.sh@59 -- # local pci status 00:04:46.274 05:04:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.274 05:04:17 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:46.274 05:04:17 -- setup/devices.sh@47 -- # setup output config 00:04:46.274 05:04:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:46.274 05:04:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:49.568 05:04:20 -- setup/devices.sh@63 -- # found=1 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.568 05:04:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.568 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.828 05:04:20 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:49.828 05:04:20 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:49.828 05:04:20 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.828 05:04:20 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:49.828 05:04:20 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:49.828 05:04:20 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.828 05:04:20 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:49.828 05:04:20 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:49.828 05:04:20 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:49.828 05:04:20 -- setup/devices.sh@50 -- # local mount_point= 00:04:49.828 05:04:20 -- setup/devices.sh@51 -- # local test_file= 00:04:49.828 05:04:20 -- setup/devices.sh@53 -- # local found=0 00:04:49.828 05:04:20 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:49.828 05:04:20 -- setup/devices.sh@59 -- # local pci status 00:04:49.828 05:04:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.828 05:04:20 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:49.828 05:04:20 -- setup/devices.sh@47 -- # setup output config 00:04:49.828 05:04:20 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:49.828 05:04:20 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:53.125 05:04:24 -- setup/devices.sh@63 -- # found=1 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.125 05:04:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.125 05:04:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.384 05:04:24 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:53.384 05:04:24 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:53.384 05:04:24 -- setup/devices.sh@68 -- # return 0 00:04:53.384 05:04:24 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:53.384 05:04:24 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.384 05:04:24 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:53.384 05:04:24 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:53.384 05:04:24 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:53.384 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:53.384 00:04:53.384 real 0m12.791s 00:04:53.384 user 0m3.681s 00:04:53.384 sys 0m7.067s 00:04:53.384 05:04:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:53.384 05:04:24 -- common/autotest_common.sh@10 -- # set +x 00:04:53.384 ************************************ 00:04:53.384 END TEST nvme_mount 00:04:53.384 ************************************ 00:04:53.384 05:04:24 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:53.384 05:04:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:53.384 05:04:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:53.384 05:04:24 -- common/autotest_common.sh@10 -- # set +x 00:04:53.384 ************************************ 00:04:53.384 START TEST dm_mount 00:04:53.384 ************************************ 00:04:53.384 05:04:24 -- common/autotest_common.sh@1104 -- # dm_mount 00:04:53.384 05:04:24 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:53.384 05:04:24 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:53.384 05:04:24 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:53.385 05:04:24 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:53.385 05:04:24 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:53.385 05:04:24 -- setup/common.sh@40 -- # local part_no=2 00:04:53.385 05:04:24 -- setup/common.sh@41 -- # local size=1073741824 00:04:53.385 05:04:24 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:53.385 05:04:24 -- setup/common.sh@44 -- # parts=() 00:04:53.385 05:04:24 -- setup/common.sh@44 -- # local parts 00:04:53.385 05:04:24 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:53.385 05:04:24 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.385 05:04:24 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:53.385 05:04:24 -- setup/common.sh@46 -- # (( part++ )) 00:04:53.385 05:04:24 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.385 05:04:24 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:53.385 05:04:24 -- setup/common.sh@46 -- # (( part++ )) 00:04:53.385 05:04:24 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.385 05:04:24 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:53.385 05:04:24 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:53.385 05:04:24 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:54.345 Creating new GPT entries in memory. 00:04:54.345 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:54.345 other utilities. 00:04:54.345 05:04:25 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:54.345 05:04:25 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:54.345 05:04:25 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:54.345 05:04:25 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:54.345 05:04:25 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:55.284 Creating new GPT entries in memory. 00:04:55.284 The operation has completed successfully. 
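Between the nvme_mount teardown above and the dm_mount run that follows, cleanup repeats the same pattern each time: unmount the test mount point if it is still mounted, then wipe signatures from the partition and from the whole disk. A sketch matching the cleanup_nvme trace, with $rootdir standing in for the full workspace spdk path:

    nvme_mount=$rootdir/test/setup/nvme_mount
    mountpoint -q "$nvme_mount" && umount "$nvme_mount"
    [[ -b /dev/nvme0n1p1 ]] && wipefs --all /dev/nvme0n1p1   # clears the ext4 magic (53 ef)
    [[ -b /dev/nvme0n1 ]]   && wipefs --all /dev/nvme0n1     # clears GPT headers and the PMBR (55 aa)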
00:04:55.284 05:04:26 -- setup/common.sh@57 -- # (( part++ )) 00:04:55.284 05:04:26 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:55.284 05:04:26 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:55.284 05:04:26 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:55.284 05:04:26 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:56.665 The operation has completed successfully. 00:04:56.665 05:04:27 -- setup/common.sh@57 -- # (( part++ )) 00:04:56.665 05:04:27 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:56.665 05:04:27 -- setup/common.sh@62 -- # wait 3102157 00:04:56.665 05:04:27 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:56.665 05:04:27 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.665 05:04:27 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.665 05:04:27 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:56.665 05:04:27 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:56.665 05:04:27 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.665 05:04:27 -- setup/devices.sh@161 -- # break 00:04:56.665 05:04:27 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.665 05:04:27 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:56.665 05:04:27 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:56.665 05:04:27 -- setup/devices.sh@166 -- # dm=dm-0 00:04:56.665 05:04:27 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:56.665 05:04:27 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:56.665 05:04:27 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.665 05:04:27 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:56.665 05:04:27 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.665 05:04:27 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.665 05:04:27 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:56.665 05:04:27 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.665 05:04:27 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.665 05:04:27 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:56.665 05:04:27 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:56.665 05:04:27 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.665 05:04:27 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.665 05:04:27 -- setup/devices.sh@53 -- # local found=0 00:04:56.665 05:04:27 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:56.665 05:04:27 -- setup/devices.sh@56 -- # : 
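The dm_mount steps traced below create a device-mapper target named nvme_dm_test over the two fresh partitions (the mapping table is piped to dmsetup and is not visible in the trace), then resolve the /dev/mapper symlink and confirm both partitions list the new dm device as a holder. A sketch of that sequence; the retry delay is an assumption, since this run finds the node on the first pass:

    dmsetup create nvme_dm_test                      # mapping table arrives on stdin (not in the trace)
    for t in {1..5}; do                              # poll until udev has created the node
        [[ -e /dev/mapper/nvme_dm_test ]] && break
        sleep 1
    done
    dm=$(readlink -f /dev/mapper/nvme_dm_test)       # resolves to /dev/dm-0 in this run
    dm=${dm##*/}
    [[ -e /sys/class/block/nvme0n1p1/holders/$dm ]]  # both partitions must list dm-0 as a holder
    [[ -e /sys/class/block/nvme0n1p2/holders/$dm ]]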
00:04:56.665 05:04:27 -- setup/devices.sh@59 -- # local pci status 00:04:56.665 05:04:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.665 05:04:27 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:56.665 05:04:27 -- setup/devices.sh@47 -- # setup output config 00:04:56.665 05:04:27 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:56.665 05:04:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:59.953 05:04:30 -- setup/devices.sh@63 -- # found=1 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.953 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.953 05:04:30 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.954 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.954 05:04:30 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:59.954 05:04:30 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:59.954 05:04:30 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:59.954 05:04:30 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:59.954 05:04:30 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:59.954 05:04:30 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:59.954 05:04:30 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:59.954 05:04:30 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:59.954 05:04:30 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:59.954 05:04:30 -- setup/devices.sh@50 -- # local mount_point= 00:04:59.954 05:04:30 -- setup/devices.sh@51 -- # local test_file= 00:04:59.954 05:04:30 -- setup/devices.sh@53 -- # local found=0 00:04:59.954 05:04:30 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:59.954 05:04:30 -- setup/devices.sh@59 -- # local pci status 00:04:59.954 05:04:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.954 05:04:30 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:59.954 05:04:30 -- setup/devices.sh@47 -- # setup output config 00:04:59.954 05:04:30 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:59.954 05:04:30 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:03.244 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.244 05:04:33 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:03.244 05:04:33 -- setup/devices.sh@63 -- # found=1 00:05:03.244 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.244 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.244 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.244 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.244 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.244 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.244 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.244 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.244 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.244 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.244 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.244 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.244 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.244 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.244 05:04:33 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.245 05:04:33 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:03.245 05:04:33 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:03.245 05:04:33 -- setup/devices.sh@68 -- # return 0 00:05:03.245 05:04:33 -- setup/devices.sh@187 -- # cleanup_dm 00:05:03.245 05:04:33 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:03.245 05:04:33 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:03.245 05:04:33 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:03.245 05:04:33 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:03.245 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:03.245 05:04:33 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:03.245 00:05:03.245 real 0m9.604s 00:05:03.245 user 0m2.171s 00:05:03.245 sys 0m4.370s 00:05:03.245 05:04:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:03.245 05:04:33 -- common/autotest_common.sh@10 -- # set +x 00:05:03.245 ************************************ 00:05:03.245 END TEST dm_mount 00:05:03.245 ************************************ 00:05:03.245 05:04:33 -- setup/devices.sh@1 -- # cleanup 00:05:03.245 05:04:33 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:03.245 05:04:33 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.245 05:04:33 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:03.245 05:04:33 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:03.245 05:04:33 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:03.245 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:03.245 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:03.245 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:03.245 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:03.245 05:04:34 -- setup/devices.sh@12 -- # cleanup_dm 00:05:03.245 05:04:34 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:03.245 05:04:34 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:03.245 05:04:34 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.245 05:04:34 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:03.245 05:04:34 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:03.245 05:04:34 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:03.245 00:05:03.245 real 0m26.655s 00:05:03.245 user 0m7.270s 00:05:03.245 sys 0m14.173s 00:05:03.245 05:04:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:03.245 05:04:34 -- common/autotest_common.sh@10 -- # set +x 00:05:03.245 ************************************ 00:05:03.245 END TEST devices 00:05:03.245 ************************************ 00:05:03.245 00:05:03.245 real 1m34.844s 00:05:03.245 user 0m28.923s 00:05:03.245 sys 0m54.690s 00:05:03.245 05:04:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:03.245 05:04:34 -- common/autotest_common.sh@10 -- # set +x 00:05:03.245 ************************************ 00:05:03.245 END TEST setup.sh 00:05:03.245 ************************************ 00:05:03.505 05:04:34 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:06.795 Hugepages 00:05:06.795 node hugesize free / total 00:05:06.795 node0 1048576kB 0 / 0 00:05:06.795 node0 2048kB 2048 / 2048 00:05:06.795 node1 1048576kB 0 / 0 00:05:06.795 node1 2048kB 0 / 0 00:05:06.795 00:05:06.795 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:06.795 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:06.795 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:06.795 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:06.795 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:06.795 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:06.795 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:06.795 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:06.795 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:06.795 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:06.795 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:06.795 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:06.795 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:06.795 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:06.795 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:06.795 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:06.795 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:06.795 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:06.795 05:04:37 -- spdk/autotest.sh@141 -- # uname -s 00:05:06.795 05:04:37 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:05:06.795 05:04:37 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:05:06.795 05:04:37 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:09.332 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:09.332 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:09.332 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:09.332 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:09.332 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
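The setup.sh status report above prints, per NUMA node, free versus total hugepages for every supported page size, followed by the BDF table of I/OAT and NVMe functions. The hugepage half of that report can be read straight from sysfs; a sketch with the output format approximated:

    for node in /sys/devices/system/node/node*; do
        for hp in "$node"/hugepages/hugepages-*kB; do
            size=${hp##*hugepages-}                  # e.g. 2048kB
            free=$(cat "$hp/free_hugepages")
            total=$(cat "$hp/nr_hugepages")
            echo "${node##*/} $size $free / $total"  # e.g. "node0 2048kB 2048 / 2048"
        done
    done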
00:05:09.332 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:09.332 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:09.332 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:09.332 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:09.592 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:09.592 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:09.592 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:09.592 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:09.592 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:09.592 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:09.592 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:11.498 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:11.498 05:04:42 -- common/autotest_common.sh@1517 -- # sleep 1 00:05:12.478 05:04:43 -- common/autotest_common.sh@1518 -- # bdfs=() 00:05:12.478 05:04:43 -- common/autotest_common.sh@1518 -- # local bdfs 00:05:12.478 05:04:43 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:05:12.478 05:04:43 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:05:12.478 05:04:43 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:12.478 05:04:43 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:12.478 05:04:43 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:12.478 05:04:43 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:12.478 05:04:43 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:12.478 05:04:43 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:12.478 05:04:43 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:12.478 05:04:43 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:15.770 Waiting for block devices as requested 00:05:15.770 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:15.770 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:15.770 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:15.770 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:15.770 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:16.030 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:16.030 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:16.030 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:16.290 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:16.290 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:16.290 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:16.549 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:16.549 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:16.549 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:16.808 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:16.808 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:16.808 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:17.068 05:04:48 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:05:17.068 05:04:48 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:17.068 05:04:48 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:05:17.068 05:04:48 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:05:17.068 05:04:48 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:17.068 05:04:48 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:17.068 05:04:48 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:17.068 05:04:48 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:05:17.068 05:04:48 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:05:17.068 05:04:48 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:05:17.068 05:04:48 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:05:17.068 05:04:48 -- common/autotest_common.sh@1530 -- # grep oacs 00:05:17.068 05:04:48 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:05:17.068 05:04:48 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:05:17.068 05:04:48 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:05:17.068 05:04:48 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:05:17.068 05:04:48 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:05:17.068 05:04:48 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:05:17.068 05:04:48 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:05:17.068 05:04:48 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:05:17.068 05:04:48 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:05:17.068 05:04:48 -- common/autotest_common.sh@1542 -- # continue 00:05:17.068 05:04:48 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:05:17.068 05:04:48 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:17.068 05:04:48 -- common/autotest_common.sh@10 -- # set +x 00:05:17.068 05:04:48 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:05:17.068 05:04:48 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:17.068 05:04:48 -- common/autotest_common.sh@10 -- # set +x 00:05:17.068 05:04:48 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:21.258 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:21.258 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:22.195 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:22.195 05:04:53 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:05:22.195 05:04:53 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:22.195 05:04:53 -- common/autotest_common.sh@10 -- # set +x 00:05:22.195 05:04:53 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:05:22.195 05:04:53 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:05:22.195 05:04:53 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:05:22.195 05:04:53 -- common/autotest_common.sh@1562 -- # bdfs=() 00:05:22.195 05:04:53 -- common/autotest_common.sh@1562 -- # local bdfs 00:05:22.195 05:04:53 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:05:22.195 05:04:53 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:22.195 05:04:53 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:22.195 05:04:53 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:22.195 05:04:53 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:22.195 05:04:53 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:22.453 05:04:53 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:22.453 05:04:53 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:22.453 05:04:53 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:05:22.454 05:04:53 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:22.454 05:04:53 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:05:22.454 05:04:53 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:22.454 05:04:53 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:05:22.454 05:04:53 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:05:22.454 05:04:53 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:05:22.454 05:04:53 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=3112066 00:05:22.454 05:04:53 -- common/autotest_common.sh@1583 -- # waitforlisten 3112066 00:05:22.454 05:04:53 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:22.454 05:04:53 -- common/autotest_common.sh@819 -- # '[' -z 3112066 ']' 00:05:22.454 05:04:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:22.454 05:04:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:22.454 05:04:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:22.454 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:22.454 05:04:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:22.454 05:04:53 -- common/autotest_common.sh@10 -- # set +x 00:05:22.454 [2024-07-23 05:04:53.421539] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
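The get_nvme_bdfs / get_nvme_bdfs_by_id flow traced above reduces to two steps: list the controller BDFs via gen_nvme.sh, then keep the ones whose PCI device ID reads back as 0x0a54. A minimal bash sketch of that flow, built only from commands visible in the trace (the standalone wrapper itself is a reconstruction, not autotest_common.sh source):

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk  # workspace path as it appears in this run
    # Enumerate NVMe controller BDFs exactly as the trace does.
    bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
    for bdf in "${bdfs[@]}"; do
        # Keep controllers whose PCI device ID matches, per the sysfs read at autotest_common.sh@1565.
        device=$(cat "/sys/bus/pci/devices/$bdf/device")
        [[ $device == 0x0a54 ]] && printf '%s\n' "$bdf"
    done

On this node the sketch would print the single controller the trace keeps finding, 0000:d8:00.0.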
00:05:22.454 [2024-07-23 05:04:53.421607] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3112066 ] 00:05:22.454 EAL: No free 2048 kB hugepages reported on node 1 00:05:22.454 [2024-07-23 05:04:53.518655] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:22.712 [2024-07-23 05:04:53.608265] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:22.712 [2024-07-23 05:04:53.608398] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:23.283 05:04:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:23.283 05:04:54 -- common/autotest_common.sh@852 -- # return 0 00:05:23.283 05:04:54 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:05:23.283 05:04:54 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:05:23.283 05:04:54 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:26.575 nvme0n1 00:05:26.575 05:04:57 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:26.575 [2024-07-23 05:04:57.601183] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:26.575 request: 00:05:26.575 { 00:05:26.575 "nvme_ctrlr_name": "nvme0", 00:05:26.575 "password": "test", 00:05:26.575 "method": "bdev_nvme_opal_revert", 00:05:26.575 "req_id": 1 00:05:26.575 } 00:05:26.575 Got JSON-RPC error response 00:05:26.575 response: 00:05:26.575 { 00:05:26.575 "code": -32602, 00:05:26.575 "message": "Invalid parameters" 00:05:26.575 } 00:05:26.575 05:04:57 -- common/autotest_common.sh@1589 -- # true 00:05:26.575 05:04:57 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:05:26.575 05:04:57 -- common/autotest_common.sh@1593 -- # killprocess 3112066 00:05:26.575 05:04:57 -- common/autotest_common.sh@926 -- # '[' -z 3112066 ']' 00:05:26.575 05:04:57 -- common/autotest_common.sh@930 -- # kill -0 3112066 00:05:26.576 05:04:57 -- common/autotest_common.sh@931 -- # uname 00:05:26.576 05:04:57 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:26.576 05:04:57 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3112066 00:05:26.835 05:04:57 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:26.835 05:04:57 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:26.835 05:04:57 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3112066' 00:05:26.835 killing process with pid 3112066 00:05:26.835 05:04:57 -- common/autotest_common.sh@945 -- # kill 3112066 00:05:26.835 05:04:57 -- common/autotest_common.sh@950 -- # wait 3112066 00:05:29.374 05:04:59 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:05:29.374 05:04:59 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:05:29.374 05:04:59 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:29.374 05:04:59 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:29.374 05:04:59 -- spdk/autotest.sh@173 -- # timing_enter lib 00:05:29.374 05:04:59 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:29.374 05:04:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.374 05:04:59 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:29.374 
05:04:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.374 05:04:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.374 05:04:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.374 ************************************ 00:05:29.374 START TEST env 00:05:29.375 ************************************ 00:05:29.375 05:04:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:29.375 * Looking for test storage... 00:05:29.375 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:29.375 05:04:59 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:29.375 05:04:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.375 05:04:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.375 05:04:59 -- common/autotest_common.sh@10 -- # set +x 00:05:29.375 ************************************ 00:05:29.375 START TEST env_memory 00:05:29.375 ************************************ 00:05:29.375 05:04:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:29.375 00:05:29.375 00:05:29.375 CUnit - A unit testing framework for C - Version 2.1-3 00:05:29.375 http://cunit.sourceforge.net/ 00:05:29.375 00:05:29.375 00:05:29.375 Suite: memory 00:05:29.375 Test: alloc and free memory map ...[2024-07-23 05:04:59.992850] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:29.375 passed 00:05:29.375 Test: mem map translation ...[2024-07-23 05:05:00.011460] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:29.375 [2024-07-23 05:05:00.011483] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:29.375 [2024-07-23 05:05:00.011528] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:29.375 [2024-07-23 05:05:00.011541] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:29.375 passed 00:05:29.375 Test: mem map registration ...[2024-07-23 05:05:00.043138] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:29.375 [2024-07-23 05:05:00.043161] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:29.375 passed 00:05:29.375 Test: mem map adjacent registrations ...passed 00:05:29.375 00:05:29.375 Run Summary: Type Total Ran Passed Failed Inactive 00:05:29.375 suites 1 1 n/a 0 0 00:05:29.375 tests 4 4 4 0 0 00:05:29.375 asserts 152 152 152 0 n/a 00:05:29.375 00:05:29.375 Elapsed time = 0.117 seconds 00:05:29.375 00:05:29.375 real 0m0.131s 00:05:29.375 user 0m0.120s 00:05:29.375 sys 0m0.010s 00:05:29.375 05:05:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:29.375 05:05:00 -- 
common/autotest_common.sh@10 -- # set +x 00:05:29.375 ************************************ 00:05:29.375 END TEST env_memory 00:05:29.375 ************************************ 00:05:29.375 05:05:00 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:29.375 05:05:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:29.375 05:05:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:29.375 05:05:00 -- common/autotest_common.sh@10 -- # set +x 00:05:29.375 ************************************ 00:05:29.375 START TEST env_vtophys 00:05:29.375 ************************************ 00:05:29.375 05:05:00 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:29.375 EAL: lib.eal log level changed from notice to debug 00:05:29.375 EAL: Detected lcore 0 as core 0 on socket 0 00:05:29.375 EAL: Detected lcore 1 as core 1 on socket 0 00:05:29.375 EAL: Detected lcore 2 as core 2 on socket 0 00:05:29.375 EAL: Detected lcore 3 as core 3 on socket 0 00:05:29.375 EAL: Detected lcore 4 as core 4 on socket 0 00:05:29.375 EAL: Detected lcore 5 as core 5 on socket 0 00:05:29.375 EAL: Detected lcore 6 as core 6 on socket 0 00:05:29.375 EAL: Detected lcore 7 as core 8 on socket 0 00:05:29.375 EAL: Detected lcore 8 as core 9 on socket 0 00:05:29.375 EAL: Detected lcore 9 as core 10 on socket 0 00:05:29.375 EAL: Detected lcore 10 as core 11 on socket 0 00:05:29.375 EAL: Detected lcore 11 as core 12 on socket 0 00:05:29.375 EAL: Detected lcore 12 as core 13 on socket 0 00:05:29.375 EAL: Detected lcore 13 as core 14 on socket 0 00:05:29.375 EAL: Detected lcore 14 as core 16 on socket 0 00:05:29.375 EAL: Detected lcore 15 as core 17 on socket 0 00:05:29.375 EAL: Detected lcore 16 as core 18 on socket 0 00:05:29.375 EAL: Detected lcore 17 as core 19 on socket 0 00:05:29.375 EAL: Detected lcore 18 as core 20 on socket 0 00:05:29.375 EAL: Detected lcore 19 as core 21 on socket 0 00:05:29.375 EAL: Detected lcore 20 as core 22 on socket 0 00:05:29.375 EAL: Detected lcore 21 as core 24 on socket 0 00:05:29.375 EAL: Detected lcore 22 as core 25 on socket 0 00:05:29.375 EAL: Detected lcore 23 as core 26 on socket 0 00:05:29.375 EAL: Detected lcore 24 as core 27 on socket 0 00:05:29.375 EAL: Detected lcore 25 as core 28 on socket 0 00:05:29.375 EAL: Detected lcore 26 as core 29 on socket 0 00:05:29.375 EAL: Detected lcore 27 as core 30 on socket 0 00:05:29.375 EAL: Detected lcore 28 as core 0 on socket 1 00:05:29.375 EAL: Detected lcore 29 as core 1 on socket 1 00:05:29.375 EAL: Detected lcore 30 as core 2 on socket 1 00:05:29.375 EAL: Detected lcore 31 as core 3 on socket 1 00:05:29.375 EAL: Detected lcore 32 as core 4 on socket 1 00:05:29.375 EAL: Detected lcore 33 as core 5 on socket 1 00:05:29.375 EAL: Detected lcore 34 as core 6 on socket 1 00:05:29.375 EAL: Detected lcore 35 as core 8 on socket 1 00:05:29.375 EAL: Detected lcore 36 as core 9 on socket 1 00:05:29.375 EAL: Detected lcore 37 as core 10 on socket 1 00:05:29.375 EAL: Detected lcore 38 as core 11 on socket 1 00:05:29.375 EAL: Detected lcore 39 as core 12 on socket 1 00:05:29.375 EAL: Detected lcore 40 as core 13 on socket 1 00:05:29.375 EAL: Detected lcore 41 as core 14 on socket 1 00:05:29.375 EAL: Detected lcore 42 as core 16 on socket 1 00:05:29.375 EAL: Detected lcore 43 as core 17 on socket 1 00:05:29.375 EAL: Detected lcore 44 as core 18 on socket 1 00:05:29.375 EAL: Detected lcore 45 as core 19 on 
socket 1 00:05:29.375 EAL: Detected lcore 46 as core 20 on socket 1 00:05:29.375 EAL: Detected lcore 47 as core 21 on socket 1 00:05:29.375 EAL: Detected lcore 48 as core 22 on socket 1 00:05:29.375 EAL: Detected lcore 49 as core 24 on socket 1 00:05:29.375 EAL: Detected lcore 50 as core 25 on socket 1 00:05:29.375 EAL: Detected lcore 51 as core 26 on socket 1 00:05:29.375 EAL: Detected lcore 52 as core 27 on socket 1 00:05:29.375 EAL: Detected lcore 53 as core 28 on socket 1 00:05:29.375 EAL: Detected lcore 54 as core 29 on socket 1 00:05:29.375 EAL: Detected lcore 55 as core 30 on socket 1 00:05:29.375 EAL: Detected lcore 56 as core 0 on socket 0 00:05:29.375 EAL: Detected lcore 57 as core 1 on socket 0 00:05:29.375 EAL: Detected lcore 58 as core 2 on socket 0 00:05:29.375 EAL: Detected lcore 59 as core 3 on socket 0 00:05:29.375 EAL: Detected lcore 60 as core 4 on socket 0 00:05:29.375 EAL: Detected lcore 61 as core 5 on socket 0 00:05:29.375 EAL: Detected lcore 62 as core 6 on socket 0 00:05:29.375 EAL: Detected lcore 63 as core 8 on socket 0 00:05:29.375 EAL: Detected lcore 64 as core 9 on socket 0 00:05:29.375 EAL: Detected lcore 65 as core 10 on socket 0 00:05:29.375 EAL: Detected lcore 66 as core 11 on socket 0 00:05:29.375 EAL: Detected lcore 67 as core 12 on socket 0 00:05:29.375 EAL: Detected lcore 68 as core 13 on socket 0 00:05:29.375 EAL: Detected lcore 69 as core 14 on socket 0 00:05:29.375 EAL: Detected lcore 70 as core 16 on socket 0 00:05:29.375 EAL: Detected lcore 71 as core 17 on socket 0 00:05:29.375 EAL: Detected lcore 72 as core 18 on socket 0 00:05:29.375 EAL: Detected lcore 73 as core 19 on socket 0 00:05:29.375 EAL: Detected lcore 74 as core 20 on socket 0 00:05:29.375 EAL: Detected lcore 75 as core 21 on socket 0 00:05:29.375 EAL: Detected lcore 76 as core 22 on socket 0 00:05:29.375 EAL: Detected lcore 77 as core 24 on socket 0 00:05:29.375 EAL: Detected lcore 78 as core 25 on socket 0 00:05:29.375 EAL: Detected lcore 79 as core 26 on socket 0 00:05:29.375 EAL: Detected lcore 80 as core 27 on socket 0 00:05:29.375 EAL: Detected lcore 81 as core 28 on socket 0 00:05:29.375 EAL: Detected lcore 82 as core 29 on socket 0 00:05:29.375 EAL: Detected lcore 83 as core 30 on socket 0 00:05:29.375 EAL: Detected lcore 84 as core 0 on socket 1 00:05:29.375 EAL: Detected lcore 85 as core 1 on socket 1 00:05:29.375 EAL: Detected lcore 86 as core 2 on socket 1 00:05:29.375 EAL: Detected lcore 87 as core 3 on socket 1 00:05:29.375 EAL: Detected lcore 88 as core 4 on socket 1 00:05:29.375 EAL: Detected lcore 89 as core 5 on socket 1 00:05:29.375 EAL: Detected lcore 90 as core 6 on socket 1 00:05:29.375 EAL: Detected lcore 91 as core 8 on socket 1 00:05:29.375 EAL: Detected lcore 92 as core 9 on socket 1 00:05:29.375 EAL: Detected lcore 93 as core 10 on socket 1 00:05:29.375 EAL: Detected lcore 94 as core 11 on socket 1 00:05:29.375 EAL: Detected lcore 95 as core 12 on socket 1 00:05:29.375 EAL: Detected lcore 96 as core 13 on socket 1 00:05:29.375 EAL: Detected lcore 97 as core 14 on socket 1 00:05:29.375 EAL: Detected lcore 98 as core 16 on socket 1 00:05:29.375 EAL: Detected lcore 99 as core 17 on socket 1 00:05:29.375 EAL: Detected lcore 100 as core 18 on socket 1 00:05:29.375 EAL: Detected lcore 101 as core 19 on socket 1 00:05:29.375 EAL: Detected lcore 102 as core 20 on socket 1 00:05:29.375 EAL: Detected lcore 103 as core 21 on socket 1 00:05:29.375 EAL: Detected lcore 104 as core 22 on socket 1 00:05:29.375 EAL: Detected lcore 105 as core 24 on socket 1 00:05:29.375 EAL: 
Detected lcore 106 as core 25 on socket 1 00:05:29.375 EAL: Detected lcore 107 as core 26 on socket 1 00:05:29.375 EAL: Detected lcore 108 as core 27 on socket 1 00:05:29.375 EAL: Detected lcore 109 as core 28 on socket 1 00:05:29.375 EAL: Detected lcore 110 as core 29 on socket 1 00:05:29.375 EAL: Detected lcore 111 as core 30 on socket 1 00:05:29.375 EAL: Maximum logical cores by configuration: 128 00:05:29.375 EAL: Detected CPU lcores: 112 00:05:29.375 EAL: Detected NUMA nodes: 2 00:05:29.375 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:29.375 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:29.375 EAL: Checking presence of .so 'librte_eal.so' 00:05:29.375 EAL: Detected static linkage of DPDK 00:05:29.375 EAL: No shared files mode enabled, IPC will be disabled 00:05:29.375 EAL: Bus pci wants IOVA as 'DC' 00:05:29.375 EAL: Buses did not request a specific IOVA mode. 00:05:29.376 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:29.376 EAL: Selected IOVA mode 'VA' 00:05:29.376 EAL: No free 2048 kB hugepages reported on node 1 00:05:29.376 EAL: Probing VFIO support... 00:05:29.376 EAL: IOMMU type 1 (Type 1) is supported 00:05:29.376 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:29.376 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:29.376 EAL: VFIO support initialized 00:05:29.376 EAL: Ask a virtual area of 0x2e000 bytes 00:05:29.376 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:29.376 EAL: Setting up physically contiguous memory... 00:05:29.376 EAL: Setting maximum number of open files to 524288 00:05:29.376 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:29.376 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:29.376 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:29.376 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.376 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:29.376 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:29.376 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.376 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:29.376 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:29.376 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.376 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:29.376 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:29.376 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.376 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:29.376 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:29.376 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.376 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:29.376 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:29.376 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.376 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:29.376 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:29.376 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.376 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:29.376 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:29.376 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.376 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:29.376 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:29.376 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:29.376 
EAL: Ask a virtual area of 0x61000 bytes 00:05:29.376 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:29.376 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:29.376 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.376 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:29.376 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:29.376 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.376 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:29.376 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:29.376 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.376 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:29.376 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:29.376 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.376 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:29.376 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:29.376 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.376 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:29.376 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:29.376 EAL: Ask a virtual area of 0x61000 bytes 00:05:29.376 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:29.376 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:29.376 EAL: Ask a virtual area of 0x400000000 bytes 00:05:29.376 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:29.376 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:29.376 EAL: Hugepages will be freed exactly as allocated. 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: TSC frequency is ~2500000 KHz 00:05:29.376 EAL: Main lcore 0 is ready (tid=7fa75d4ffa00;cpuset=[0]) 00:05:29.376 EAL: Trying to obtain current memory policy. 00:05:29.376 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.376 EAL: Restoring previous memory policy: 0 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was expanded by 2MB 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Mem event callback 'spdk:(nil)' registered 00:05:29.376 00:05:29.376 00:05:29.376 CUnit - A unit testing framework for C - Version 2.1-3 00:05:29.376 http://cunit.sourceforge.net/ 00:05:29.376 00:05:29.376 00:05:29.376 Suite: components_suite 00:05:29.376 Test: vtophys_malloc_test ...passed 00:05:29.376 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:29.376 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.376 EAL: Restoring previous memory policy: 4 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was expanded by 4MB 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was shrunk by 4MB 00:05:29.376 EAL: Trying to obtain current memory policy. 
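The vtophys_spdk_malloc_test rounds that follow expand and then shrink the socket-0 heap through 4, 6, 10, 18, 34, 66, 130, 258, 514 and 1026 MB; each size is 2^n + 2 MB for n = 1..10 (an observation from this trace, not a claim about the test's source). A one-liner reproducing the sequence:

    for n in $(seq 1 10); do printf '%dMB\n' $(( (1 << n) + 2 )); done  # 4MB 6MB 10MB ... 1026MB

Every round follows the same protocol visible below: set MPOL_PREFERRED for socket 0, grow the heap through the 'spdk:(nil)' mem event callback, synchronize with mp_malloc_sync, then shrink back by the same amount.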
00:05:29.376 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.376 EAL: Restoring previous memory policy: 4 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was expanded by 6MB 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was shrunk by 6MB 00:05:29.376 EAL: Trying to obtain current memory policy. 00:05:29.376 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.376 EAL: Restoring previous memory policy: 4 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was expanded by 10MB 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was shrunk by 10MB 00:05:29.376 EAL: Trying to obtain current memory policy. 00:05:29.376 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.376 EAL: Restoring previous memory policy: 4 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was expanded by 18MB 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was shrunk by 18MB 00:05:29.376 EAL: Trying to obtain current memory policy. 00:05:29.376 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.376 EAL: Restoring previous memory policy: 4 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was expanded by 34MB 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was shrunk by 34MB 00:05:29.376 EAL: Trying to obtain current memory policy. 00:05:29.376 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.376 EAL: Restoring previous memory policy: 4 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was expanded by 66MB 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was shrunk by 66MB 00:05:29.376 EAL: Trying to obtain current memory policy. 
00:05:29.376 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.376 EAL: Restoring previous memory policy: 4 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was expanded by 130MB 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was shrunk by 130MB 00:05:29.376 EAL: Trying to obtain current memory policy. 00:05:29.376 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.376 EAL: Restoring previous memory policy: 4 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.376 EAL: request: mp_malloc_sync 00:05:29.376 EAL: No shared files mode enabled, IPC is disabled 00:05:29.376 EAL: Heap on socket 0 was expanded by 258MB 00:05:29.376 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.636 EAL: request: mp_malloc_sync 00:05:29.636 EAL: No shared files mode enabled, IPC is disabled 00:05:29.636 EAL: Heap on socket 0 was shrunk by 258MB 00:05:29.636 EAL: Trying to obtain current memory policy. 00:05:29.636 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.636 EAL: Restoring previous memory policy: 4 00:05:29.636 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.636 EAL: request: mp_malloc_sync 00:05:29.636 EAL: No shared files mode enabled, IPC is disabled 00:05:29.636 EAL: Heap on socket 0 was expanded by 514MB 00:05:29.636 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.895 EAL: request: mp_malloc_sync 00:05:29.895 EAL: No shared files mode enabled, IPC is disabled 00:05:29.895 EAL: Heap on socket 0 was shrunk by 514MB 00:05:29.895 EAL: Trying to obtain current memory policy. 
00:05:29.895 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.895 EAL: Restoring previous memory policy: 4 00:05:29.895 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.895 EAL: request: mp_malloc_sync 00:05:29.895 EAL: No shared files mode enabled, IPC is disabled 00:05:29.895 EAL: Heap on socket 0 was expanded by 1026MB 00:05:30.159 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.421 EAL: request: mp_malloc_sync 00:05:30.421 EAL: No shared files mode enabled, IPC is disabled 00:05:30.421 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:30.421 passed 00:05:30.421 00:05:30.421 Run Summary: Type Total Ran Passed Failed Inactive 00:05:30.421 suites 1 1 n/a 0 0 00:05:30.421 tests 2 2 2 0 0 00:05:30.421 asserts 497 497 497 0 n/a 00:05:30.421 00:05:30.421 Elapsed time = 1.010 seconds 00:05:30.421 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.421 EAL: request: mp_malloc_sync 00:05:30.421 EAL: No shared files mode enabled, IPC is disabled 00:05:30.421 EAL: Heap on socket 0 was shrunk by 2MB 00:05:30.421 EAL: No shared files mode enabled, IPC is disabled 00:05:30.421 EAL: No shared files mode enabled, IPC is disabled 00:05:30.421 EAL: No shared files mode enabled, IPC is disabled 00:05:30.421 00:05:30.421 real 0m1.152s 00:05:30.421 user 0m0.660s 00:05:30.421 sys 0m0.464s 00:05:30.421 05:05:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.421 05:05:01 -- common/autotest_common.sh@10 -- # set +x 00:05:30.421 ************************************ 00:05:30.421 END TEST env_vtophys 00:05:30.421 ************************************ 00:05:30.421 05:05:01 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:30.421 05:05:01 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:30.421 05:05:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:30.421 05:05:01 -- common/autotest_common.sh@10 -- # set +x 00:05:30.421 ************************************ 00:05:30.421 START TEST env_pci 00:05:30.421 ************************************ 00:05:30.421 05:05:01 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:30.421 00:05:30.421 00:05:30.421 CUnit - A unit testing framework for C - Version 2.1-3 00:05:30.421 http://cunit.sourceforge.net/ 00:05:30.421 00:05:30.421 00:05:30.421 Suite: pci 00:05:30.421 Test: pci_hook ...[2024-07-23 05:05:01.351620] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3113600 has claimed it 00:05:30.421 EAL: Cannot find device (10000:00:01.0) 00:05:30.421 EAL: Failed to attach device on primary process 00:05:30.421 passed 00:05:30.421 00:05:30.421 Run Summary: Type Total Ran Passed Failed Inactive 00:05:30.421 suites 1 1 n/a 0 0 00:05:30.421 tests 1 1 1 0 0 00:05:30.421 asserts 25 25 25 0 n/a 00:05:30.421 00:05:30.421 Elapsed time = 0.039 seconds 00:05:30.421 00:05:30.421 real 0m0.059s 00:05:30.421 user 0m0.017s 00:05:30.421 sys 0m0.041s 00:05:30.421 05:05:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.421 05:05:01 -- common/autotest_common.sh@10 -- # set +x 00:05:30.421 ************************************ 00:05:30.421 END TEST env_pci 00:05:30.421 ************************************ 00:05:30.421 05:05:01 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:30.421 05:05:01 -- env/env.sh@15 -- # uname 00:05:30.421 05:05:01 -- env/env.sh@15 -- # '[' 
Linux = Linux ']' 00:05:30.421 05:05:01 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:30.421 05:05:01 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:30.421 05:05:01 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:05:30.421 05:05:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:30.421 05:05:01 -- common/autotest_common.sh@10 -- # set +x 00:05:30.421 ************************************ 00:05:30.421 START TEST env_dpdk_post_init 00:05:30.421 ************************************ 00:05:30.421 05:05:01 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:30.421 EAL: Detected CPU lcores: 112 00:05:30.421 EAL: Detected NUMA nodes: 2 00:05:30.421 EAL: Detected static linkage of DPDK 00:05:30.421 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:30.421 EAL: Selected IOVA mode 'VA' 00:05:30.421 EAL: No free 2048 kB hugepages reported on node 1 00:05:30.421 EAL: VFIO support initialized 00:05:30.681 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:30.681 EAL: Using IOMMU type 1 (Type 1) 00:05:31.619 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:34.910 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:34.910 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:35.169 Starting DPDK initialization... 00:05:35.169 Starting SPDK post initialization... 00:05:35.169 SPDK NVMe probe 00:05:35.169 Attaching to 0000:d8:00.0 00:05:35.169 Attached to 0000:d8:00.0 00:05:35.169 Cleaning up... 
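The env_dpdk_post_init run above brings EAL up on a single core and probes the NVMe controller through vfio-pci. The sequence can be replayed from commands already shown in this log; -c 0x1 restricts the run to core 0 and --base-virtaddr pins DPDK's mapping base, matching the EAL parameters traced earlier (the back-to-back ordering here is a sketch inferred from the rebind messages, not a quote of env.sh):

    spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$spdk/scripts/setup.sh"        # kernel drivers -> vfio-pci, as in the ioatdma/nvme rebind messages above
    "$spdk/test/env/env_dpdk_post_init/env_dpdk_post_init" -c 0x1 --base-virtaddr=0x200000000000
    "$spdk/scripts/setup.sh" reset  # vfio-pci -> nvme/ioatdma, as at autotest_common.sh@1521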
00:05:35.169 00:05:35.169 real 0m4.787s 00:05:35.169 user 0m3.550s 00:05:35.169 sys 0m0.474s 00:05:35.169 05:05:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.169 05:05:06 -- common/autotest_common.sh@10 -- # set +x 00:05:35.169 ************************************ 00:05:35.169 END TEST env_dpdk_post_init 00:05:35.169 ************************************ 00:05:35.428 05:05:06 -- env/env.sh@26 -- # uname 00:05:35.428 05:05:06 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:35.428 05:05:06 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:35.428 05:05:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:35.428 05:05:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:35.428 05:05:06 -- common/autotest_common.sh@10 -- # set +x 00:05:35.428 ************************************ 00:05:35.428 START TEST env_mem_callbacks 00:05:35.428 ************************************ 00:05:35.428 05:05:06 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:35.428 EAL: Detected CPU lcores: 112 00:05:35.428 EAL: Detected NUMA nodes: 2 00:05:35.428 EAL: Detected static linkage of DPDK 00:05:35.428 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:35.428 EAL: Selected IOVA mode 'VA' 00:05:35.428 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.428 EAL: VFIO support initialized 00:05:35.428 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:35.428 00:05:35.428 00:05:35.428 CUnit - A unit testing framework for C - Version 2.1-3 00:05:35.428 http://cunit.sourceforge.net/ 00:05:35.428 00:05:35.428 00:05:35.428 Suite: memory 00:05:35.428 Test: test ... 
00:05:35.428 register 0x200000200000 2097152 00:05:35.428 malloc 3145728 00:05:35.428 register 0x200000400000 4194304 00:05:35.428 buf 0x200000500000 len 3145728 PASSED 00:05:35.428 malloc 64 00:05:35.428 buf 0x2000004fff40 len 64 PASSED 00:05:35.428 malloc 4194304 00:05:35.428 register 0x200000800000 6291456 00:05:35.428 buf 0x200000a00000 len 4194304 PASSED 00:05:35.428 free 0x200000500000 3145728 00:05:35.428 free 0x2000004fff40 64 00:05:35.428 unregister 0x200000400000 4194304 PASSED 00:05:35.428 free 0x200000a00000 4194304 00:05:35.428 unregister 0x200000800000 6291456 PASSED 00:05:35.428 malloc 8388608 00:05:35.428 register 0x200000400000 10485760 00:05:35.428 buf 0x200000600000 len 8388608 PASSED 00:05:35.428 free 0x200000600000 8388608 00:05:35.428 unregister 0x200000400000 10485760 PASSED 00:05:35.428 passed 00:05:35.428 00:05:35.428 Run Summary: Type Total Ran Passed Failed Inactive 00:05:35.428 suites 1 1 n/a 0 0 00:05:35.428 tests 1 1 1 0 0 00:05:35.428 asserts 15 15 15 0 n/a 00:05:35.428 00:05:35.428 Elapsed time = 0.008 seconds 00:05:35.428 00:05:35.428 real 0m0.076s 00:05:35.428 user 0m0.019s 00:05:35.428 sys 0m0.057s 00:05:35.428 05:05:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.428 05:05:06 -- common/autotest_common.sh@10 -- # set +x 00:05:35.428 ************************************ 00:05:35.428 END TEST env_mem_callbacks 00:05:35.428 ************************************ 00:05:35.428 00:05:35.428 real 0m6.518s 00:05:35.428 user 0m4.471s 00:05:35.428 sys 0m1.299s 00:05:35.428 05:05:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.428 05:05:06 -- common/autotest_common.sh@10 -- # set +x 00:05:35.428 ************************************ 00:05:35.428 END TEST env 00:05:35.428 ************************************ 00:05:35.428 05:05:06 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:35.428 05:05:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:35.428 05:05:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:35.428 05:05:06 -- common/autotest_common.sh@10 -- # set +x 00:05:35.428 ************************************ 00:05:35.428 START TEST rpc 00:05:35.428 ************************************ 00:05:35.428 05:05:06 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:35.688 * Looking for test storage... 00:05:35.688 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:35.688 05:05:06 -- rpc/rpc.sh@65 -- # spdk_pid=3114539 00:05:35.688 05:05:06 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:35.688 05:05:06 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:35.688 05:05:06 -- rpc/rpc.sh@67 -- # waitforlisten 3114539 00:05:35.688 05:05:06 -- common/autotest_common.sh@819 -- # '[' -z 3114539 ']' 00:05:35.688 05:05:06 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.688 05:05:06 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:35.688 05:05:06 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.688 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
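The waitforlisten call above blocks until the freshly launched spdk_tgt (pid 3114539 here) answers on /var/tmp/spdk.sock, with the max_retries=100 bound visible in the trace. A minimal sketch of that polling loop; using rpc_get_methods as the liveness probe is an assumption for illustration, and the real helper in autotest_common.sh is more involved:

    waitfor() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} max_retries=100 i
        for ((i = 0; i < max_retries; i++)); do
            kill -0 "$pid" 2>/dev/null || return 1  # target process died before it ever listened
            # Any successful RPC proves the socket is up; rpc_get_methods has no side effects.
            /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py \
                -s "$rpc_addr" rpc_get_methods &>/dev/null && return 0
            sleep 0.5
        done
        return 1
    }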
00:05:35.688 05:05:06 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:35.688 05:05:06 -- common/autotest_common.sh@10 -- # set +x 00:05:35.688 [2024-07-23 05:05:06.576672] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:05:35.688 [2024-07-23 05:05:06.576742] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3114539 ] 00:05:35.688 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.688 [2024-07-23 05:05:06.680596] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.947 [2024-07-23 05:05:06.792490] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:35.947 [2024-07-23 05:05:06.792623] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:35.947 [2024-07-23 05:05:06.792638] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3114539' to capture a snapshot of events at runtime. 00:05:35.947 [2024-07-23 05:05:06.792652] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3114539 for offline analysis/debug. 00:05:35.947 [2024-07-23 05:05:06.792674] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.516 05:05:07 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:36.516 05:05:07 -- common/autotest_common.sh@852 -- # return 0 00:05:36.516 05:05:07 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:36.516 05:05:07 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:36.516 05:05:07 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:36.516 05:05:07 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:36.516 05:05:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.516 05:05:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.516 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.516 ************************************ 00:05:36.516 START TEST rpc_integrity 00:05:36.516 ************************************ 00:05:36.516 05:05:07 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:36.516 05:05:07 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:36.516 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.516 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.516 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.516 05:05:07 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:36.516 05:05:07 -- rpc/rpc.sh@13 -- # jq length 00:05:36.516 05:05:07 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:36.516 05:05:07 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:36.516 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.516 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.516 05:05:07 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.516 05:05:07 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:36.516 05:05:07 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:36.516 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.516 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.776 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.776 05:05:07 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:36.776 { 00:05:36.776 "name": "Malloc0", 00:05:36.776 "aliases": [ 00:05:36.776 "248b19c7-1d8a-4a08-afce-9db93515710f" 00:05:36.776 ], 00:05:36.776 "product_name": "Malloc disk", 00:05:36.776 "block_size": 512, 00:05:36.776 "num_blocks": 16384, 00:05:36.776 "uuid": "248b19c7-1d8a-4a08-afce-9db93515710f", 00:05:36.776 "assigned_rate_limits": { 00:05:36.776 "rw_ios_per_sec": 0, 00:05:36.776 "rw_mbytes_per_sec": 0, 00:05:36.776 "r_mbytes_per_sec": 0, 00:05:36.776 "w_mbytes_per_sec": 0 00:05:36.776 }, 00:05:36.776 "claimed": false, 00:05:36.776 "zoned": false, 00:05:36.776 "supported_io_types": { 00:05:36.776 "read": true, 00:05:36.776 "write": true, 00:05:36.776 "unmap": true, 00:05:36.776 "write_zeroes": true, 00:05:36.776 "flush": true, 00:05:36.776 "reset": true, 00:05:36.776 "compare": false, 00:05:36.776 "compare_and_write": false, 00:05:36.776 "abort": true, 00:05:36.776 "nvme_admin": false, 00:05:36.776 "nvme_io": false 00:05:36.776 }, 00:05:36.776 "memory_domains": [ 00:05:36.776 { 00:05:36.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.776 "dma_device_type": 2 00:05:36.776 } 00:05:36.776 ], 00:05:36.776 "driver_specific": {} 00:05:36.776 } 00:05:36.776 ]' 00:05:36.776 05:05:07 -- rpc/rpc.sh@17 -- # jq length 00:05:36.776 05:05:07 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:36.776 05:05:07 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:36.776 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.776 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.776 [2024-07-23 05:05:07.669081] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:36.776 [2024-07-23 05:05:07.669128] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:36.776 [2024-07-23 05:05:07.669148] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x49625e0 00:05:36.776 [2024-07-23 05:05:07.669160] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:36.776 [2024-07-23 05:05:07.670264] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:36.776 [2024-07-23 05:05:07.670291] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:36.776 Passthru0 00:05:36.776 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.776 05:05:07 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:36.776 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.776 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.777 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.777 05:05:07 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:36.777 { 00:05:36.777 "name": "Malloc0", 00:05:36.777 "aliases": [ 00:05:36.777 "248b19c7-1d8a-4a08-afce-9db93515710f" 00:05:36.777 ], 00:05:36.777 "product_name": "Malloc disk", 00:05:36.777 "block_size": 512, 00:05:36.777 "num_blocks": 16384, 00:05:36.777 "uuid": "248b19c7-1d8a-4a08-afce-9db93515710f", 00:05:36.777 "assigned_rate_limits": { 00:05:36.777 "rw_ios_per_sec": 0, 00:05:36.777 
"rw_mbytes_per_sec": 0, 00:05:36.777 "r_mbytes_per_sec": 0, 00:05:36.777 "w_mbytes_per_sec": 0 00:05:36.777 }, 00:05:36.777 "claimed": true, 00:05:36.777 "claim_type": "exclusive_write", 00:05:36.777 "zoned": false, 00:05:36.777 "supported_io_types": { 00:05:36.777 "read": true, 00:05:36.777 "write": true, 00:05:36.777 "unmap": true, 00:05:36.777 "write_zeroes": true, 00:05:36.777 "flush": true, 00:05:36.777 "reset": true, 00:05:36.777 "compare": false, 00:05:36.777 "compare_and_write": false, 00:05:36.777 "abort": true, 00:05:36.777 "nvme_admin": false, 00:05:36.777 "nvme_io": false 00:05:36.777 }, 00:05:36.777 "memory_domains": [ 00:05:36.777 { 00:05:36.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.777 "dma_device_type": 2 00:05:36.777 } 00:05:36.777 ], 00:05:36.777 "driver_specific": {} 00:05:36.777 }, 00:05:36.777 { 00:05:36.777 "name": "Passthru0", 00:05:36.777 "aliases": [ 00:05:36.777 "d6e62777-6bf8-5834-a5bb-6aa82aa5568a" 00:05:36.777 ], 00:05:36.777 "product_name": "passthru", 00:05:36.777 "block_size": 512, 00:05:36.777 "num_blocks": 16384, 00:05:36.777 "uuid": "d6e62777-6bf8-5834-a5bb-6aa82aa5568a", 00:05:36.777 "assigned_rate_limits": { 00:05:36.777 "rw_ios_per_sec": 0, 00:05:36.777 "rw_mbytes_per_sec": 0, 00:05:36.777 "r_mbytes_per_sec": 0, 00:05:36.777 "w_mbytes_per_sec": 0 00:05:36.777 }, 00:05:36.777 "claimed": false, 00:05:36.777 "zoned": false, 00:05:36.777 "supported_io_types": { 00:05:36.777 "read": true, 00:05:36.777 "write": true, 00:05:36.777 "unmap": true, 00:05:36.777 "write_zeroes": true, 00:05:36.777 "flush": true, 00:05:36.777 "reset": true, 00:05:36.777 "compare": false, 00:05:36.777 "compare_and_write": false, 00:05:36.777 "abort": true, 00:05:36.777 "nvme_admin": false, 00:05:36.777 "nvme_io": false 00:05:36.777 }, 00:05:36.777 "memory_domains": [ 00:05:36.777 { 00:05:36.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.777 "dma_device_type": 2 00:05:36.777 } 00:05:36.777 ], 00:05:36.777 "driver_specific": { 00:05:36.777 "passthru": { 00:05:36.777 "name": "Passthru0", 00:05:36.777 "base_bdev_name": "Malloc0" 00:05:36.777 } 00:05:36.777 } 00:05:36.777 } 00:05:36.777 ]' 00:05:36.777 05:05:07 -- rpc/rpc.sh@21 -- # jq length 00:05:36.777 05:05:07 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:36.777 05:05:07 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:36.777 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.777 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.777 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.777 05:05:07 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:36.777 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.777 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.777 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.777 05:05:07 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:36.777 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.777 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.777 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.777 05:05:07 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:36.777 05:05:07 -- rpc/rpc.sh@26 -- # jq length 00:05:36.777 05:05:07 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:36.777 00:05:36.777 real 0m0.300s 00:05:36.777 user 0m0.200s 00:05:36.777 sys 0m0.037s 00:05:36.777 05:05:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.777 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.777 
************************************ 00:05:36.777 END TEST rpc_integrity 00:05:36.777 ************************************ 00:05:36.777 05:05:07 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:36.777 05:05:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.777 05:05:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.777 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.777 ************************************ 00:05:36.777 START TEST rpc_plugins 00:05:36.777 ************************************ 00:05:36.777 05:05:07 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:05:36.777 05:05:07 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:36.777 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.777 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:36.777 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.777 05:05:07 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:36.777 05:05:07 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:36.777 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.777 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:37.037 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.037 05:05:07 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:37.037 { 00:05:37.037 "name": "Malloc1", 00:05:37.037 "aliases": [ 00:05:37.037 "0ad7189f-6188-4633-b5b8-398f3a5011df" 00:05:37.037 ], 00:05:37.037 "product_name": "Malloc disk", 00:05:37.037 "block_size": 4096, 00:05:37.037 "num_blocks": 256, 00:05:37.037 "uuid": "0ad7189f-6188-4633-b5b8-398f3a5011df", 00:05:37.037 "assigned_rate_limits": { 00:05:37.037 "rw_ios_per_sec": 0, 00:05:37.037 "rw_mbytes_per_sec": 0, 00:05:37.037 "r_mbytes_per_sec": 0, 00:05:37.037 "w_mbytes_per_sec": 0 00:05:37.037 }, 00:05:37.037 "claimed": false, 00:05:37.037 "zoned": false, 00:05:37.037 "supported_io_types": { 00:05:37.037 "read": true, 00:05:37.037 "write": true, 00:05:37.037 "unmap": true, 00:05:37.037 "write_zeroes": true, 00:05:37.037 "flush": true, 00:05:37.037 "reset": true, 00:05:37.037 "compare": false, 00:05:37.037 "compare_and_write": false, 00:05:37.037 "abort": true, 00:05:37.037 "nvme_admin": false, 00:05:37.037 "nvme_io": false 00:05:37.037 }, 00:05:37.037 "memory_domains": [ 00:05:37.037 { 00:05:37.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:37.037 "dma_device_type": 2 00:05:37.037 } 00:05:37.037 ], 00:05:37.037 "driver_specific": {} 00:05:37.037 } 00:05:37.037 ]' 00:05:37.037 05:05:07 -- rpc/rpc.sh@32 -- # jq length 00:05:37.037 05:05:07 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:37.037 05:05:07 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:37.037 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.037 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:37.037 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.037 05:05:07 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:37.037 05:05:07 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.037 05:05:07 -- common/autotest_common.sh@10 -- # set +x 00:05:37.037 05:05:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.037 05:05:07 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:37.037 05:05:07 -- rpc/rpc.sh@36 -- # jq length 00:05:37.037 05:05:08 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:37.037 00:05:37.037 real 0m0.166s 00:05:37.037 user 0m0.113s 00:05:37.037 sys 0m0.021s 00:05:37.037 05:05:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:05:37.037 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.037 ************************************ 00:05:37.037 END TEST rpc_plugins 00:05:37.037 ************************************ 00:05:37.037 05:05:08 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:37.037 05:05:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.037 05:05:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.037 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.037 ************************************ 00:05:37.037 START TEST rpc_trace_cmd_test 00:05:37.037 ************************************ 00:05:37.037 05:05:08 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:05:37.037 05:05:08 -- rpc/rpc.sh@40 -- # local info 00:05:37.037 05:05:08 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:37.037 05:05:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.037 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.037 05:05:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.037 05:05:08 -- rpc/rpc.sh@42 -- # info='{ 00:05:37.037 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3114539", 00:05:37.037 "tpoint_group_mask": "0x8", 00:05:37.037 "iscsi_conn": { 00:05:37.037 "mask": "0x2", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "scsi": { 00:05:37.037 "mask": "0x4", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "bdev": { 00:05:37.037 "mask": "0x8", 00:05:37.037 "tpoint_mask": "0xffffffffffffffff" 00:05:37.037 }, 00:05:37.037 "nvmf_rdma": { 00:05:37.037 "mask": "0x10", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "nvmf_tcp": { 00:05:37.037 "mask": "0x20", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "ftl": { 00:05:37.037 "mask": "0x40", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "blobfs": { 00:05:37.037 "mask": "0x80", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "dsa": { 00:05:37.037 "mask": "0x200", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "thread": { 00:05:37.037 "mask": "0x400", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "nvme_pcie": { 00:05:37.037 "mask": "0x800", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "iaa": { 00:05:37.037 "mask": "0x1000", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "nvme_tcp": { 00:05:37.037 "mask": "0x2000", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 }, 00:05:37.037 "bdev_nvme": { 00:05:37.037 "mask": "0x4000", 00:05:37.037 "tpoint_mask": "0x0" 00:05:37.037 } 00:05:37.037 }' 00:05:37.037 05:05:08 -- rpc/rpc.sh@43 -- # jq length 00:05:37.297 05:05:08 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:37.297 05:05:08 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:37.297 05:05:08 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:37.297 05:05:08 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:37.297 05:05:08 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:37.297 05:05:08 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:37.297 05:05:08 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:37.297 05:05:08 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:37.297 05:05:08 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:37.297 00:05:37.297 real 0m0.278s 00:05:37.297 user 0m0.237s 00:05:37.297 sys 0m0.033s 00:05:37.297 05:05:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.297 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.297 
************************************ 00:05:37.297 END TEST rpc_trace_cmd_test 00:05:37.297 ************************************ 00:05:37.297 05:05:08 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:37.297 05:05:08 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:37.297 05:05:08 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:37.297 05:05:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.297 05:05:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.297 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.556 ************************************ 00:05:37.556 START TEST rpc_daemon_integrity 00:05:37.556 ************************************ 00:05:37.556 05:05:08 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:37.556 05:05:08 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:37.556 05:05:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.556 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.556 05:05:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.556 05:05:08 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:37.556 05:05:08 -- rpc/rpc.sh@13 -- # jq length 00:05:37.556 05:05:08 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:37.556 05:05:08 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:37.556 05:05:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.556 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.556 05:05:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.556 05:05:08 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:37.556 05:05:08 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:37.556 05:05:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.556 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.556 05:05:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.556 05:05:08 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:37.556 { 00:05:37.556 "name": "Malloc2", 00:05:37.556 "aliases": [ 00:05:37.556 "45a4ce25-d324-4ee9-b6f1-da84f03d13fc" 00:05:37.556 ], 00:05:37.556 "product_name": "Malloc disk", 00:05:37.556 "block_size": 512, 00:05:37.556 "num_blocks": 16384, 00:05:37.556 "uuid": "45a4ce25-d324-4ee9-b6f1-da84f03d13fc", 00:05:37.556 "assigned_rate_limits": { 00:05:37.556 "rw_ios_per_sec": 0, 00:05:37.556 "rw_mbytes_per_sec": 0, 00:05:37.556 "r_mbytes_per_sec": 0, 00:05:37.556 "w_mbytes_per_sec": 0 00:05:37.556 }, 00:05:37.556 "claimed": false, 00:05:37.556 "zoned": false, 00:05:37.556 "supported_io_types": { 00:05:37.556 "read": true, 00:05:37.556 "write": true, 00:05:37.556 "unmap": true, 00:05:37.556 "write_zeroes": true, 00:05:37.556 "flush": true, 00:05:37.556 "reset": true, 00:05:37.556 "compare": false, 00:05:37.556 "compare_and_write": false, 00:05:37.556 "abort": true, 00:05:37.556 "nvme_admin": false, 00:05:37.556 "nvme_io": false 00:05:37.556 }, 00:05:37.556 "memory_domains": [ 00:05:37.556 { 00:05:37.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:37.556 "dma_device_type": 2 00:05:37.556 } 00:05:37.556 ], 00:05:37.556 "driver_specific": {} 00:05:37.556 } 00:05:37.556 ]' 00:05:37.556 05:05:08 -- rpc/rpc.sh@17 -- # jq length 00:05:37.556 05:05:08 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:37.556 05:05:08 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:37.556 05:05:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.556 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.556 [2024-07-23 05:05:08.511335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:05:37.556 [2024-07-23 05:05:08.511372] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:37.556 [2024-07-23 05:05:08.511391] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x47ca5e0 00:05:37.556 [2024-07-23 05:05:08.511403] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:37.556 [2024-07-23 05:05:08.512322] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:37.556 [2024-07-23 05:05:08.512348] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:37.556 Passthru0 00:05:37.556 05:05:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.556 05:05:08 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:37.556 05:05:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.556 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.556 05:05:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.556 05:05:08 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:37.556 { 00:05:37.556 "name": "Malloc2", 00:05:37.556 "aliases": [ 00:05:37.556 "45a4ce25-d324-4ee9-b6f1-da84f03d13fc" 00:05:37.556 ], 00:05:37.556 "product_name": "Malloc disk", 00:05:37.556 "block_size": 512, 00:05:37.556 "num_blocks": 16384, 00:05:37.556 "uuid": "45a4ce25-d324-4ee9-b6f1-da84f03d13fc", 00:05:37.556 "assigned_rate_limits": { 00:05:37.556 "rw_ios_per_sec": 0, 00:05:37.556 "rw_mbytes_per_sec": 0, 00:05:37.556 "r_mbytes_per_sec": 0, 00:05:37.556 "w_mbytes_per_sec": 0 00:05:37.556 }, 00:05:37.556 "claimed": true, 00:05:37.556 "claim_type": "exclusive_write", 00:05:37.556 "zoned": false, 00:05:37.556 "supported_io_types": { 00:05:37.556 "read": true, 00:05:37.556 "write": true, 00:05:37.556 "unmap": true, 00:05:37.556 "write_zeroes": true, 00:05:37.556 "flush": true, 00:05:37.556 "reset": true, 00:05:37.556 "compare": false, 00:05:37.556 "compare_and_write": false, 00:05:37.556 "abort": true, 00:05:37.556 "nvme_admin": false, 00:05:37.556 "nvme_io": false 00:05:37.556 }, 00:05:37.556 "memory_domains": [ 00:05:37.556 { 00:05:37.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:37.556 "dma_device_type": 2 00:05:37.556 } 00:05:37.556 ], 00:05:37.556 "driver_specific": {} 00:05:37.556 }, 00:05:37.556 { 00:05:37.556 "name": "Passthru0", 00:05:37.556 "aliases": [ 00:05:37.556 "905679ce-4950-57c4-ae4b-70a3f30e8eb5" 00:05:37.556 ], 00:05:37.556 "product_name": "passthru", 00:05:37.556 "block_size": 512, 00:05:37.556 "num_blocks": 16384, 00:05:37.556 "uuid": "905679ce-4950-57c4-ae4b-70a3f30e8eb5", 00:05:37.556 "assigned_rate_limits": { 00:05:37.556 "rw_ios_per_sec": 0, 00:05:37.556 "rw_mbytes_per_sec": 0, 00:05:37.556 "r_mbytes_per_sec": 0, 00:05:37.556 "w_mbytes_per_sec": 0 00:05:37.556 }, 00:05:37.556 "claimed": false, 00:05:37.556 "zoned": false, 00:05:37.556 "supported_io_types": { 00:05:37.556 "read": true, 00:05:37.556 "write": true, 00:05:37.556 "unmap": true, 00:05:37.556 "write_zeroes": true, 00:05:37.556 "flush": true, 00:05:37.556 "reset": true, 00:05:37.556 "compare": false, 00:05:37.556 "compare_and_write": false, 00:05:37.556 "abort": true, 00:05:37.556 "nvme_admin": false, 00:05:37.556 "nvme_io": false 00:05:37.556 }, 00:05:37.556 "memory_domains": [ 00:05:37.556 { 00:05:37.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:37.557 "dma_device_type": 2 00:05:37.557 } 00:05:37.557 ], 00:05:37.557 "driver_specific": { 00:05:37.557 "passthru": { 00:05:37.557 "name": "Passthru0", 00:05:37.557 "base_bdev_name": "Malloc2" 00:05:37.557 } 
00:05:37.557 } 00:05:37.557 } 00:05:37.557 ]' 00:05:37.557 05:05:08 -- rpc/rpc.sh@21 -- # jq length 00:05:37.557 05:05:08 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:37.557 05:05:08 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:37.557 05:05:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.557 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.557 05:05:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.557 05:05:08 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:37.557 05:05:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.557 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.557 05:05:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.557 05:05:08 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:37.557 05:05:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:37.557 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.557 05:05:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:37.557 05:05:08 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:37.557 05:05:08 -- rpc/rpc.sh@26 -- # jq length 00:05:37.816 05:05:08 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:37.816 00:05:37.816 real 0m0.260s 00:05:37.816 user 0m0.166s 00:05:37.816 sys 0m0.043s 00:05:37.816 05:05:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.816 05:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:37.816 ************************************ 00:05:37.816 END TEST rpc_daemon_integrity 00:05:37.816 ************************************ 00:05:37.816 05:05:08 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:37.816 05:05:08 -- rpc/rpc.sh@84 -- # killprocess 3114539 00:05:37.816 05:05:08 -- common/autotest_common.sh@926 -- # '[' -z 3114539 ']' 00:05:37.816 05:05:08 -- common/autotest_common.sh@930 -- # kill -0 3114539 00:05:37.816 05:05:08 -- common/autotest_common.sh@931 -- # uname 00:05:37.816 05:05:08 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:37.816 05:05:08 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3114539 00:05:37.816 05:05:08 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:37.816 05:05:08 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:37.816 05:05:08 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3114539' 00:05:37.816 killing process with pid 3114539 00:05:37.816 05:05:08 -- common/autotest_common.sh@945 -- # kill 3114539 00:05:37.816 05:05:08 -- common/autotest_common.sh@950 -- # wait 3114539 00:05:38.076 00:05:38.076 real 0m2.615s 00:05:38.076 user 0m3.388s 00:05:38.076 sys 0m0.765s 00:05:38.076 05:05:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:38.076 05:05:09 -- common/autotest_common.sh@10 -- # set +x 00:05:38.076 ************************************ 00:05:38.076 END TEST rpc 00:05:38.076 ************************************ 00:05:38.076 05:05:09 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:38.076 05:05:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:38.076 05:05:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:38.076 05:05:09 -- common/autotest_common.sh@10 -- # set +x 00:05:38.076 ************************************ 00:05:38.076 START TEST rpc_client 00:05:38.076 ************************************ 00:05:38.076 05:05:09 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:38.336 * Looking for test storage... 00:05:38.336 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:38.336 05:05:09 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:38.336 OK 00:05:38.336 05:05:09 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:38.336 00:05:38.336 real 0m0.123s 00:05:38.336 user 0m0.056s 00:05:38.336 sys 0m0.077s 00:05:38.336 05:05:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:38.336 05:05:09 -- common/autotest_common.sh@10 -- # set +x 00:05:38.336 ************************************ 00:05:38.336 END TEST rpc_client 00:05:38.336 ************************************ 00:05:38.336 05:05:09 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:38.336 05:05:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:38.336 05:05:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:38.336 05:05:09 -- common/autotest_common.sh@10 -- # set +x 00:05:38.336 ************************************ 00:05:38.336 START TEST json_config 00:05:38.336 ************************************ 00:05:38.336 05:05:09 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:38.336 05:05:09 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:38.336 05:05:09 -- nvmf/common.sh@7 -- # uname -s 00:05:38.336 05:05:09 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:38.336 05:05:09 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:38.336 05:05:09 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:38.336 05:05:09 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:38.336 05:05:09 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:38.336 05:05:09 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:38.336 05:05:09 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:38.336 05:05:09 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:38.336 05:05:09 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:38.336 05:05:09 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:38.336 05:05:09 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:38.336 05:05:09 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:38.336 05:05:09 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:38.336 05:05:09 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:38.336 05:05:09 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:38.336 05:05:09 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:38.336 05:05:09 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:38.336 05:05:09 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:38.336 05:05:09 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:38.336 05:05:09 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.336 05:05:09 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.336 05:05:09 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.336 05:05:09 -- paths/export.sh@5 -- # export PATH 00:05:38.336 05:05:09 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.336 05:05:09 -- nvmf/common.sh@46 -- # : 0 00:05:38.336 05:05:09 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:38.336 05:05:09 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:38.336 05:05:09 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:38.336 05:05:09 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:38.336 05:05:09 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:38.336 05:05:09 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:38.336 05:05:09 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:38.336 05:05:09 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:38.336 05:05:09 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:38.336 05:05:09 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:38.336 05:05:09 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:38.336 05:05:09 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:38.336 05:05:09 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:38.336 WARNING: No tests are enabled so not running JSON configuration tests 00:05:38.336 05:05:09 -- json_config/json_config.sh@27 -- # exit 0 00:05:38.336 00:05:38.336 real 0m0.101s 00:05:38.336 user 0m0.052s 00:05:38.336 sys 0m0.051s 00:05:38.336 05:05:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:38.336 05:05:09 -- common/autotest_common.sh@10 -- # set +x 00:05:38.336 ************************************ 00:05:38.336 END TEST json_config 00:05:38.336 ************************************ 00:05:38.596 05:05:09 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:38.596 05:05:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:38.596 05:05:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:38.596 05:05:09 -- common/autotest_common.sh@10 -- # set +x 00:05:38.596 ************************************ 00:05:38.596 START TEST json_config_extra_key 00:05:38.596 ************************************ 00:05:38.596 05:05:09 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:38.596 05:05:09 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:38.596 05:05:09 -- nvmf/common.sh@7 -- # uname -s 00:05:38.596 05:05:09 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:38.596 05:05:09 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:38.596 05:05:09 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:38.596 05:05:09 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:38.596 05:05:09 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:38.596 05:05:09 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:38.596 05:05:09 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:38.596 05:05:09 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:38.596 05:05:09 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:38.596 05:05:09 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:38.596 05:05:09 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:38.596 05:05:09 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:38.596 05:05:09 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:38.596 05:05:09 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:38.596 05:05:09 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:38.596 05:05:09 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:38.596 05:05:09 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:38.596 05:05:09 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:38.596 05:05:09 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:38.596 05:05:09 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.596 05:05:09 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.596 05:05:09 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.596 05:05:09 -- paths/export.sh@5 -- # export PATH 00:05:38.596 05:05:09 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:38.596 05:05:09 -- nvmf/common.sh@46 -- # : 0 00:05:38.596 05:05:09 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:38.596 05:05:09 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:38.596 05:05:09 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:38.596 05:05:09 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:38.596 05:05:09 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:38.596 05:05:09 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:38.596 05:05:09 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:38.596 05:05:09 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:38.596 05:05:09 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:38.596 05:05:09 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:38.596 05:05:09 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:38.596 05:05:09 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:38.596 05:05:09 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:38.596 05:05:09 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:38.597 INFO: launching applications... 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=3115313 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:38.597 Waiting for target to run... 
00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 3115313 /var/tmp/spdk_tgt.sock 00:05:38.597 05:05:09 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:38.597 05:05:09 -- common/autotest_common.sh@819 -- # '[' -z 3115313 ']' 00:05:38.597 05:05:09 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:38.597 05:05:09 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:38.597 05:05:09 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:38.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:38.597 05:05:09 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:38.597 05:05:09 -- common/autotest_common.sh@10 -- # set +x 00:05:38.597 [2024-07-23 05:05:09.562263] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:05:38.597 [2024-07-23 05:05:09.562329] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3115313 ] 00:05:38.597 EAL: No free 2048 kB hugepages reported on node 1 00:05:38.856 [2024-07-23 05:05:09.884395] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:39.115 [2024-07-23 05:05:09.960495] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:39.115 [2024-07-23 05:05:09.960611] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.373 05:05:10 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:39.373 05:05:10 -- common/autotest_common.sh@852 -- # return 0 00:05:39.373 05:05:10 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:39.373 00:05:39.373 05:05:10 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:39.373 INFO: shutting down applications... 
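The shutdown sequence that follows sends SIGINT and then polls the pid for up to thirty half-second intervals before declaring the target down. Condensed into a standalone sketch (the variable name is illustrative; the budget of 30 x 0.5 s matches the trace):

  kill -SIGINT "$tgt_pid"
  for (( i = 0; i < 30; i++ )); do
      if ! kill -0 "$tgt_pid" 2>/dev/null; then
          echo 'SPDK target shutdown done'
          break
      fi
      sleep 0.5
  done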
00:05:39.373 05:05:10 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:39.373 05:05:10 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:39.373 05:05:10 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:39.373 05:05:10 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 3115313 ]] 00:05:39.374 05:05:10 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 3115313 00:05:39.374 05:05:10 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:39.374 05:05:10 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:39.374 05:05:10 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3115313 00:05:39.374 05:05:10 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:39.940 05:05:10 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:39.940 05:05:10 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:39.940 05:05:10 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3115313 00:05:39.940 05:05:10 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:39.940 05:05:10 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:39.940 05:05:10 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:39.940 05:05:10 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:39.940 SPDK target shutdown done 00:05:39.940 05:05:10 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:39.940 Success 00:05:39.940 00:05:39.940 real 0m1.519s 00:05:39.940 user 0m1.298s 00:05:39.940 sys 0m0.444s 00:05:39.940 05:05:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:39.940 05:05:10 -- common/autotest_common.sh@10 -- # set +x 00:05:39.940 ************************************ 00:05:39.940 END TEST json_config_extra_key 00:05:39.940 ************************************ 00:05:39.940 05:05:10 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:39.940 05:05:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:39.940 05:05:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:39.940 05:05:10 -- common/autotest_common.sh@10 -- # set +x 00:05:39.940 ************************************ 00:05:39.940 START TEST alias_rpc 00:05:39.940 ************************************ 00:05:39.940 05:05:11 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:40.198 * Looking for test storage... 00:05:40.198 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:40.198 05:05:11 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:40.198 05:05:11 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:40.198 05:05:11 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3115629 00:05:40.198 05:05:11 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3115629 00:05:40.198 05:05:11 -- common/autotest_common.sh@819 -- # '[' -z 3115629 ']' 00:05:40.198 05:05:11 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:40.198 05:05:11 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:40.198 05:05:11 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:05:40.198 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:40.198 05:05:11 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:40.198 05:05:11 -- common/autotest_common.sh@10 -- # set +x 00:05:40.198 [2024-07-23 05:05:11.127527] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:05:40.198 [2024-07-23 05:05:11.127597] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3115629 ] 00:05:40.198 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.198 [2024-07-23 05:05:11.225571] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.457 [2024-07-23 05:05:11.312795] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:40.457 [2024-07-23 05:05:11.312924] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.025 05:05:12 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:41.025 05:05:12 -- common/autotest_common.sh@852 -- # return 0 00:05:41.025 05:05:12 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:41.284 05:05:12 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3115629 00:05:41.284 05:05:12 -- common/autotest_common.sh@926 -- # '[' -z 3115629 ']' 00:05:41.284 05:05:12 -- common/autotest_common.sh@930 -- # kill -0 3115629 00:05:41.284 05:05:12 -- common/autotest_common.sh@931 -- # uname 00:05:41.284 05:05:12 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:41.284 05:05:12 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3115629 00:05:41.284 05:05:12 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:41.284 05:05:12 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:41.284 05:05:12 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3115629' 00:05:41.284 killing process with pid 3115629 00:05:41.284 05:05:12 -- common/autotest_common.sh@945 -- # kill 3115629 00:05:41.284 05:05:12 -- common/autotest_common.sh@950 -- # wait 3115629 00:05:41.853 00:05:41.853 real 0m1.658s 00:05:41.853 user 0m1.847s 00:05:41.853 sys 0m0.490s 00:05:41.853 05:05:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:41.853 05:05:12 -- common/autotest_common.sh@10 -- # set +x 00:05:41.853 ************************************ 00:05:41.853 END TEST alias_rpc 00:05:41.853 ************************************ 00:05:41.853 05:05:12 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:05:41.853 05:05:12 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:41.853 05:05:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:41.853 05:05:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:41.853 05:05:12 -- common/autotest_common.sh@10 -- # set +x 00:05:41.853 ************************************ 00:05:41.853 START TEST spdkcli_tcp 00:05:41.853 ************************************ 00:05:41.853 05:05:12 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:41.853 * Looking for test storage... 
00:05:41.853 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:41.853 05:05:12 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:41.853 05:05:12 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:41.853 05:05:12 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:41.853 05:05:12 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:41.853 05:05:12 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:41.853 05:05:12 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:41.853 05:05:12 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:41.853 05:05:12 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:41.853 05:05:12 -- common/autotest_common.sh@10 -- # set +x 00:05:41.853 05:05:12 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3115957 00:05:41.853 05:05:12 -- spdkcli/tcp.sh@27 -- # waitforlisten 3115957 00:05:41.853 05:05:12 -- common/autotest_common.sh@819 -- # '[' -z 3115957 ']' 00:05:41.853 05:05:12 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:41.853 05:05:12 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:41.853 05:05:12 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:41.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:41.853 05:05:12 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:41.853 05:05:12 -- common/autotest_common.sh@10 -- # set +x 00:05:41.853 05:05:12 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:41.853 [2024-07-23 05:05:12.842531] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
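Here spdk_tgt runs with -m 0x3 -p 0: the mask has bits 0 and 1 set, so two reactors come up (cores 0 and 1 in the startup lines below), and -p selects the main core. A quick shell snippet to decode such a core mask, offered as a sketch rather than anything the test itself runs:

  mask=0x3
  printf 'reactor cores:'
  for bit in {0..31}; do
      # Test each bit of the mask; print the core number when it is set.
      (( (mask >> bit) & 1 )) && printf ' %d' "$bit"
  done
  echo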
00:05:41.853 [2024-07-23 05:05:12.842622] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3115957 ] 00:05:41.853 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.853 [2024-07-23 05:05:12.941505] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:42.112 [2024-07-23 05:05:13.030279] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:42.112 [2024-07-23 05:05:13.030432] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:42.112 [2024-07-23 05:05:13.030436] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:43.055 05:05:14 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:43.055 05:05:14 -- common/autotest_common.sh@852 -- # return 0 00:05:43.055 05:05:14 -- spdkcli/tcp.sh@31 -- # socat_pid=3116228 00:05:43.055 05:05:14 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:43.055 05:05:14 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:43.314 [ 00:05:43.314 "spdk_get_version", 00:05:43.314 "rpc_get_methods", 00:05:43.314 "trace_get_info", 00:05:43.314 "trace_get_tpoint_group_mask", 00:05:43.314 "trace_disable_tpoint_group", 00:05:43.314 "trace_enable_tpoint_group", 00:05:43.314 "trace_clear_tpoint_mask", 00:05:43.314 "trace_set_tpoint_mask", 00:05:43.314 "vfu_tgt_set_base_path", 00:05:43.314 "framework_get_pci_devices", 00:05:43.314 "framework_get_config", 00:05:43.314 "framework_get_subsystems", 00:05:43.314 "iobuf_get_stats", 00:05:43.314 "iobuf_set_options", 00:05:43.314 "sock_set_default_impl", 00:05:43.314 "sock_impl_set_options", 00:05:43.314 "sock_impl_get_options", 00:05:43.314 "vmd_rescan", 00:05:43.314 "vmd_remove_device", 00:05:43.314 "vmd_enable", 00:05:43.314 "accel_get_stats", 00:05:43.314 "accel_set_options", 00:05:43.314 "accel_set_driver", 00:05:43.315 "accel_crypto_key_destroy", 00:05:43.315 "accel_crypto_keys_get", 00:05:43.315 "accel_crypto_key_create", 00:05:43.315 "accel_assign_opc", 00:05:43.315 "accel_get_module_info", 00:05:43.315 "accel_get_opc_assignments", 00:05:43.315 "notify_get_notifications", 00:05:43.315 "notify_get_types", 00:05:43.315 "bdev_get_histogram", 00:05:43.315 "bdev_enable_histogram", 00:05:43.315 "bdev_set_qos_limit", 00:05:43.315 "bdev_set_qd_sampling_period", 00:05:43.315 "bdev_get_bdevs", 00:05:43.315 "bdev_reset_iostat", 00:05:43.315 "bdev_get_iostat", 00:05:43.315 "bdev_examine", 00:05:43.315 "bdev_wait_for_examine", 00:05:43.315 "bdev_set_options", 00:05:43.315 "scsi_get_devices", 00:05:43.315 "thread_set_cpumask", 00:05:43.315 "framework_get_scheduler", 00:05:43.315 "framework_set_scheduler", 00:05:43.315 "framework_get_reactors", 00:05:43.315 "thread_get_io_channels", 00:05:43.315 "thread_get_pollers", 00:05:43.315 "thread_get_stats", 00:05:43.315 "framework_monitor_context_switch", 00:05:43.315 "spdk_kill_instance", 00:05:43.315 "log_enable_timestamps", 00:05:43.315 "log_get_flags", 00:05:43.315 "log_clear_flag", 00:05:43.315 "log_set_flag", 00:05:43.315 "log_get_level", 00:05:43.315 "log_set_level", 00:05:43.315 "log_get_print_level", 00:05:43.315 "log_set_print_level", 00:05:43.315 "framework_enable_cpumask_locks", 00:05:43.315 "framework_disable_cpumask_locks", 00:05:43.315 "framework_wait_init", 00:05:43.315 
"framework_start_init", 00:05:43.315 "virtio_blk_create_transport", 00:05:43.315 "virtio_blk_get_transports", 00:05:43.315 "vhost_controller_set_coalescing", 00:05:43.315 "vhost_get_controllers", 00:05:43.315 "vhost_delete_controller", 00:05:43.315 "vhost_create_blk_controller", 00:05:43.315 "vhost_scsi_controller_remove_target", 00:05:43.315 "vhost_scsi_controller_add_target", 00:05:43.315 "vhost_start_scsi_controller", 00:05:43.315 "vhost_create_scsi_controller", 00:05:43.315 "ublk_recover_disk", 00:05:43.315 "ublk_get_disks", 00:05:43.315 "ublk_stop_disk", 00:05:43.315 "ublk_start_disk", 00:05:43.315 "ublk_destroy_target", 00:05:43.315 "ublk_create_target", 00:05:43.315 "nbd_get_disks", 00:05:43.315 "nbd_stop_disk", 00:05:43.315 "nbd_start_disk", 00:05:43.315 "env_dpdk_get_mem_stats", 00:05:43.315 "nvmf_subsystem_get_listeners", 00:05:43.315 "nvmf_subsystem_get_qpairs", 00:05:43.315 "nvmf_subsystem_get_controllers", 00:05:43.315 "nvmf_get_stats", 00:05:43.315 "nvmf_get_transports", 00:05:43.315 "nvmf_create_transport", 00:05:43.315 "nvmf_get_targets", 00:05:43.315 "nvmf_delete_target", 00:05:43.315 "nvmf_create_target", 00:05:43.315 "nvmf_subsystem_allow_any_host", 00:05:43.315 "nvmf_subsystem_remove_host", 00:05:43.315 "nvmf_subsystem_add_host", 00:05:43.315 "nvmf_subsystem_remove_ns", 00:05:43.315 "nvmf_subsystem_add_ns", 00:05:43.315 "nvmf_subsystem_listener_set_ana_state", 00:05:43.315 "nvmf_discovery_get_referrals", 00:05:43.315 "nvmf_discovery_remove_referral", 00:05:43.315 "nvmf_discovery_add_referral", 00:05:43.315 "nvmf_subsystem_remove_listener", 00:05:43.315 "nvmf_subsystem_add_listener", 00:05:43.315 "nvmf_delete_subsystem", 00:05:43.315 "nvmf_create_subsystem", 00:05:43.315 "nvmf_get_subsystems", 00:05:43.315 "nvmf_set_crdt", 00:05:43.315 "nvmf_set_config", 00:05:43.315 "nvmf_set_max_subsystems", 00:05:43.315 "iscsi_set_options", 00:05:43.315 "iscsi_get_auth_groups", 00:05:43.315 "iscsi_auth_group_remove_secret", 00:05:43.315 "iscsi_auth_group_add_secret", 00:05:43.315 "iscsi_delete_auth_group", 00:05:43.315 "iscsi_create_auth_group", 00:05:43.315 "iscsi_set_discovery_auth", 00:05:43.315 "iscsi_get_options", 00:05:43.315 "iscsi_target_node_request_logout", 00:05:43.315 "iscsi_target_node_set_redirect", 00:05:43.315 "iscsi_target_node_set_auth", 00:05:43.315 "iscsi_target_node_add_lun", 00:05:43.315 "iscsi_get_connections", 00:05:43.315 "iscsi_portal_group_set_auth", 00:05:43.315 "iscsi_start_portal_group", 00:05:43.315 "iscsi_delete_portal_group", 00:05:43.315 "iscsi_create_portal_group", 00:05:43.315 "iscsi_get_portal_groups", 00:05:43.315 "iscsi_delete_target_node", 00:05:43.315 "iscsi_target_node_remove_pg_ig_maps", 00:05:43.315 "iscsi_target_node_add_pg_ig_maps", 00:05:43.315 "iscsi_create_target_node", 00:05:43.315 "iscsi_get_target_nodes", 00:05:43.315 "iscsi_delete_initiator_group", 00:05:43.315 "iscsi_initiator_group_remove_initiators", 00:05:43.315 "iscsi_initiator_group_add_initiators", 00:05:43.315 "iscsi_create_initiator_group", 00:05:43.315 "iscsi_get_initiator_groups", 00:05:43.315 "vfu_virtio_create_scsi_endpoint", 00:05:43.315 "vfu_virtio_scsi_remove_target", 00:05:43.315 "vfu_virtio_scsi_add_target", 00:05:43.315 "vfu_virtio_create_blk_endpoint", 00:05:43.315 "vfu_virtio_delete_endpoint", 00:05:43.315 "iaa_scan_accel_module", 00:05:43.315 "dsa_scan_accel_module", 00:05:43.315 "ioat_scan_accel_module", 00:05:43.315 "accel_error_inject_error", 00:05:43.315 "bdev_iscsi_delete", 00:05:43.315 "bdev_iscsi_create", 00:05:43.315 "bdev_iscsi_set_options", 
00:05:43.315 "bdev_virtio_attach_controller", 00:05:43.315 "bdev_virtio_scsi_get_devices", 00:05:43.315 "bdev_virtio_detach_controller", 00:05:43.315 "bdev_virtio_blk_set_hotplug", 00:05:43.315 "bdev_ftl_set_property", 00:05:43.315 "bdev_ftl_get_properties", 00:05:43.315 "bdev_ftl_get_stats", 00:05:43.315 "bdev_ftl_unmap", 00:05:43.315 "bdev_ftl_unload", 00:05:43.315 "bdev_ftl_delete", 00:05:43.315 "bdev_ftl_load", 00:05:43.315 "bdev_ftl_create", 00:05:43.315 "bdev_aio_delete", 00:05:43.315 "bdev_aio_rescan", 00:05:43.315 "bdev_aio_create", 00:05:43.315 "blobfs_create", 00:05:43.315 "blobfs_detect", 00:05:43.315 "blobfs_set_cache_size", 00:05:43.315 "bdev_zone_block_delete", 00:05:43.315 "bdev_zone_block_create", 00:05:43.315 "bdev_delay_delete", 00:05:43.315 "bdev_delay_create", 00:05:43.315 "bdev_delay_update_latency", 00:05:43.315 "bdev_split_delete", 00:05:43.315 "bdev_split_create", 00:05:43.315 "bdev_error_inject_error", 00:05:43.315 "bdev_error_delete", 00:05:43.315 "bdev_error_create", 00:05:43.315 "bdev_raid_set_options", 00:05:43.315 "bdev_raid_remove_base_bdev", 00:05:43.315 "bdev_raid_add_base_bdev", 00:05:43.315 "bdev_raid_delete", 00:05:43.315 "bdev_raid_create", 00:05:43.315 "bdev_raid_get_bdevs", 00:05:43.315 "bdev_lvol_grow_lvstore", 00:05:43.315 "bdev_lvol_get_lvols", 00:05:43.315 "bdev_lvol_get_lvstores", 00:05:43.315 "bdev_lvol_delete", 00:05:43.315 "bdev_lvol_set_read_only", 00:05:43.315 "bdev_lvol_resize", 00:05:43.315 "bdev_lvol_decouple_parent", 00:05:43.315 "bdev_lvol_inflate", 00:05:43.315 "bdev_lvol_rename", 00:05:43.315 "bdev_lvol_clone_bdev", 00:05:43.315 "bdev_lvol_clone", 00:05:43.315 "bdev_lvol_snapshot", 00:05:43.315 "bdev_lvol_create", 00:05:43.315 "bdev_lvol_delete_lvstore", 00:05:43.315 "bdev_lvol_rename_lvstore", 00:05:43.315 "bdev_lvol_create_lvstore", 00:05:43.315 "bdev_passthru_delete", 00:05:43.315 "bdev_passthru_create", 00:05:43.315 "bdev_nvme_cuse_unregister", 00:05:43.315 "bdev_nvme_cuse_register", 00:05:43.315 "bdev_opal_new_user", 00:05:43.315 "bdev_opal_set_lock_state", 00:05:43.315 "bdev_opal_delete", 00:05:43.315 "bdev_opal_get_info", 00:05:43.315 "bdev_opal_create", 00:05:43.315 "bdev_nvme_opal_revert", 00:05:43.315 "bdev_nvme_opal_init", 00:05:43.315 "bdev_nvme_send_cmd", 00:05:43.315 "bdev_nvme_get_path_iostat", 00:05:43.315 "bdev_nvme_get_mdns_discovery_info", 00:05:43.315 "bdev_nvme_stop_mdns_discovery", 00:05:43.315 "bdev_nvme_start_mdns_discovery", 00:05:43.315 "bdev_nvme_set_multipath_policy", 00:05:43.315 "bdev_nvme_set_preferred_path", 00:05:43.315 "bdev_nvme_get_io_paths", 00:05:43.315 "bdev_nvme_remove_error_injection", 00:05:43.315 "bdev_nvme_add_error_injection", 00:05:43.315 "bdev_nvme_get_discovery_info", 00:05:43.315 "bdev_nvme_stop_discovery", 00:05:43.315 "bdev_nvme_start_discovery", 00:05:43.315 "bdev_nvme_get_controller_health_info", 00:05:43.315 "bdev_nvme_disable_controller", 00:05:43.315 "bdev_nvme_enable_controller", 00:05:43.315 "bdev_nvme_reset_controller", 00:05:43.315 "bdev_nvme_get_transport_statistics", 00:05:43.315 "bdev_nvme_apply_firmware", 00:05:43.315 "bdev_nvme_detach_controller", 00:05:43.315 "bdev_nvme_get_controllers", 00:05:43.315 "bdev_nvme_attach_controller", 00:05:43.315 "bdev_nvme_set_hotplug", 00:05:43.315 "bdev_nvme_set_options", 00:05:43.315 "bdev_null_resize", 00:05:43.315 "bdev_null_delete", 00:05:43.316 "bdev_null_create", 00:05:43.316 "bdev_malloc_delete", 00:05:43.316 "bdev_malloc_create" 00:05:43.316 ] 00:05:43.316 05:05:14 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:43.316 05:05:14 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:43.316 05:05:14 -- common/autotest_common.sh@10 -- # set +x 00:05:43.316 05:05:14 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:43.316 05:05:14 -- spdkcli/tcp.sh@38 -- # killprocess 3115957 00:05:43.316 05:05:14 -- common/autotest_common.sh@926 -- # '[' -z 3115957 ']' 00:05:43.316 05:05:14 -- common/autotest_common.sh@930 -- # kill -0 3115957 00:05:43.316 05:05:14 -- common/autotest_common.sh@931 -- # uname 00:05:43.316 05:05:14 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:43.316 05:05:14 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3115957 00:05:43.316 05:05:14 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:43.316 05:05:14 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:43.316 05:05:14 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3115957' 00:05:43.316 killing process with pid 3115957 00:05:43.316 05:05:14 -- common/autotest_common.sh@945 -- # kill 3115957 00:05:43.316 05:05:14 -- common/autotest_common.sh@950 -- # wait 3115957 00:05:43.596 00:05:43.596 real 0m1.946s 00:05:43.596 user 0m3.923s 00:05:43.596 sys 0m0.561s 00:05:43.596 05:05:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:43.596 05:05:14 -- common/autotest_common.sh@10 -- # set +x 00:05:43.596 ************************************ 00:05:43.596 END TEST spdkcli_tcp 00:05:43.596 ************************************ 00:05:43.870 05:05:14 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:43.870 05:05:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:43.870 05:05:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:43.870 05:05:14 -- common/autotest_common.sh@10 -- # set +x 00:05:43.870 ************************************ 00:05:43.870 START TEST dpdk_mem_utility 00:05:43.870 ************************************ 00:05:43.870 05:05:14 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:43.870 * Looking for test storage... 00:05:43.870 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:43.870 05:05:14 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:43.870 05:05:14 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3116349 00:05:43.870 05:05:14 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3116349 00:05:43.871 05:05:14 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:43.871 05:05:14 -- common/autotest_common.sh@819 -- # '[' -z 3116349 ']' 00:05:43.871 05:05:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:43.871 05:05:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:43.871 05:05:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:43.871 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
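The dpdk_mem_utility flow below asks the running target to dump its DPDK allocator state and then renders that dump with dpdk_mem_info.py. Reduced to the commands involved (default socket and dump path, as in this run):

  ./scripts/rpc.py env_dpdk_get_mem_stats    # writes /tmp/spdk_mem_dump.txt
  ./scripts/dpdk_mem_info.py                 # heap / mempool / memzone summary
  ./scripts/dpdk_mem_info.py -m 0            # per-element dump of heap id 0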
00:05:43.871 05:05:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:43.871 05:05:14 -- common/autotest_common.sh@10 -- # set +x 00:05:43.871 [2024-07-23 05:05:14.836140] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:05:43.871 [2024-07-23 05:05:14.836218] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3116349 ] 00:05:43.871 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.871 [2024-07-23 05:05:14.933856] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:44.130 [2024-07-23 05:05:15.024106] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:44.130 [2024-07-23 05:05:15.024240] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.699 05:05:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:44.699 05:05:15 -- common/autotest_common.sh@852 -- # return 0 00:05:44.699 05:05:15 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:44.699 05:05:15 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:44.699 05:05:15 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:44.699 05:05:15 -- common/autotest_common.sh@10 -- # set +x 00:05:44.699 { 00:05:44.699 "filename": "/tmp/spdk_mem_dump.txt" 00:05:44.699 } 00:05:44.699 05:05:15 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:44.699 05:05:15 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:44.959 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:44.959 1 heaps totaling size 814.000000 MiB 00:05:44.959 size: 814.000000 MiB heap id: 0 00:05:44.959 end heaps---------- 00:05:44.959 8 mempools totaling size 598.116089 MiB 00:05:44.959 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:44.959 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:44.959 size: 84.521057 MiB name: bdev_io_3116349 00:05:44.959 size: 51.011292 MiB name: evtpool_3116349 00:05:44.959 size: 50.003479 MiB name: msgpool_3116349 00:05:44.959 size: 21.763794 MiB name: PDU_Pool 00:05:44.959 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:44.959 size: 0.026123 MiB name: Session_Pool 00:05:44.959 end mempools------- 00:05:44.959 6 memzones totaling size 4.142822 MiB 00:05:44.959 size: 1.000366 MiB name: RG_ring_0_3116349 00:05:44.959 size: 1.000366 MiB name: RG_ring_1_3116349 00:05:44.959 size: 1.000366 MiB name: RG_ring_4_3116349 00:05:44.959 size: 1.000366 MiB name: RG_ring_5_3116349 00:05:44.959 size: 0.125366 MiB name: RG_ring_2_3116349 00:05:44.959 size: 0.015991 MiB name: RG_ring_3_3116349 00:05:44.959 end memzones------- 00:05:44.959 05:05:15 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:44.959 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:44.959 list of free elements. 
size: 12.519348 MiB 00:05:44.959 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:44.959 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:44.959 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:44.959 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:44.959 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:44.959 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:44.959 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:44.959 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:44.959 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:44.959 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:44.959 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:44.959 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:44.959 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:44.959 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:44.959 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:44.959 list of standard malloc elements. size: 199.218079 MiB 00:05:44.959 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:44.959 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:44.959 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:44.959 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:44.959 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:44.959 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:44.959 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:44.959 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:44.959 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:44.959 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:44.959 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:44.959 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:44.959 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:44.959 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:44.959 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:44.959 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:44.959 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:44.959 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:44.959 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:44.959 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:44.959 list of memzone associated elements. size: 602.262573 MiB 00:05:44.960 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:44.960 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:44.960 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:44.960 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:44.960 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:44.960 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_3116349_0 00:05:44.960 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:44.960 associated memzone info: size: 48.002930 MiB name: MP_evtpool_3116349_0 00:05:44.960 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:44.960 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3116349_0 00:05:44.960 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:44.960 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:44.960 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:44.960 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:44.960 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:44.960 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_3116349 00:05:44.960 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:44.960 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3116349 00:05:44.960 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:44.960 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3116349 00:05:44.960 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:44.960 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:44.960 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:44.960 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:44.960 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:44.960 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:44.960 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:44.960 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:44.960 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:44.960 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3116349 00:05:44.960 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:44.960 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3116349 00:05:44.960 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:44.960 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3116349 00:05:44.960 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:44.960 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3116349 00:05:44.960 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:44.960 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3116349 00:05:44.960 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:44.960 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:44.960 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:44.960 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:44.960 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:44.960 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:44.960 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:44.960 associated memzone info: size: 0.125366 MiB name: RG_ring_2_3116349 00:05:44.960 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:44.960 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:44.960 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:44.960 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:44.960 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:44.960 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3116349 00:05:44.960 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:44.960 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:44.960 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:44.960 associated memzone info: size: 0.000183 MiB name: MP_msgpool_3116349 00:05:44.960 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:44.960 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3116349 00:05:44.960 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:44.960 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:44.960 05:05:15 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:44.960 05:05:15 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3116349 00:05:44.960 05:05:15 -- common/autotest_common.sh@926 -- # '[' -z 3116349 ']' 00:05:44.960 05:05:15 -- common/autotest_common.sh@930 -- # kill -0 3116349 00:05:44.960 05:05:15 -- common/autotest_common.sh@931 -- # uname 00:05:44.960 05:05:15 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:44.960 05:05:15 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3116349 00:05:44.960 05:05:15 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:44.960 05:05:15 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:44.960 05:05:15 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3116349' 00:05:44.960 killing process with pid 3116349 00:05:44.960 05:05:15 -- common/autotest_common.sh@945 -- # kill 3116349 00:05:44.960 05:05:15 -- common/autotest_common.sh@950 -- # wait 3116349 00:05:45.220 00:05:45.220 real 0m1.547s 00:05:45.220 user 0m1.652s 00:05:45.220 sys 0m0.491s 00:05:45.220 05:05:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:45.220 05:05:16 -- common/autotest_common.sh@10 -- # set +x 00:05:45.220 ************************************ 00:05:45.220 END TEST dpdk_mem_utility 00:05:45.220 ************************************ 00:05:45.220 05:05:16 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:45.220 05:05:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:45.220 05:05:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:45.220 05:05:16 -- common/autotest_common.sh@10 -- # set +x 
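The killprocess helper traced after each test follows the same shape every time: confirm the pid is still alive, check the process name (an SPDK target shows up as reactor_0), then kill and reap it. A condensed reconstruction from the xtrace output; simplified, not the helper's exact source:

  killprocess() {
      local pid=$1
      [[ -n $pid ]] || return 1
      kill -0 "$pid" 2>/dev/null || return 1      # must still be running
      local name
      name=$(ps --no-headers -o comm= "$pid")     # SPDK targets report reactor_0
      echo "killing process with pid $pid ($name)"
      kill "$pid"
      wait "$pid"                                 # reap the child
  }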
00:05:45.220 ************************************ 00:05:45.220 START TEST event 00:05:45.220 ************************************ 00:05:45.220 05:05:16 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:45.479 * Looking for test storage... 00:05:45.479 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:45.480 05:05:16 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:45.480 05:05:16 -- bdev/nbd_common.sh@6 -- # set -e 00:05:45.480 05:05:16 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:45.480 05:05:16 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:05:45.480 05:05:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:45.480 05:05:16 -- common/autotest_common.sh@10 -- # set +x 00:05:45.480 ************************************ 00:05:45.480 START TEST event_perf 00:05:45.480 ************************************ 00:05:45.480 05:05:16 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:45.480 Running I/O for 1 seconds...[2024-07-23 05:05:16.422110] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:05:45.480 [2024-07-23 05:05:16.422173] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3116744 ] 00:05:45.480 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.480 [2024-07-23 05:05:16.516207] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:45.739 [2024-07-23 05:05:16.605054] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:45.739 [2024-07-23 05:05:16.605150] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:45.739 [2024-07-23 05:05:16.605232] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:45.739 [2024-07-23 05:05:16.605235] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.677 Running I/O for 1 seconds... 00:05:46.677 lcore 0: 171668 00:05:46.677 lcore 1: 171667 00:05:46.677 lcore 2: 171669 00:05:46.677 lcore 3: 171671 00:05:46.677 done. 
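The event_perf run above has just printed its per-lcore event counts. The same binary can be invoked by hand with the arguments recorded in the trace, assuming hugepages are configured as the harness arranges; expect the same output shape, not the same counts:

    # Assumed manual re-run of the traced event_perf invocation (workspace-relative path).
    sudo ./spdk/test/event/event_perf/event_perf -m 0xF -t 1
    # prints one 'lcore N: <count>' line per core in the 0xF mask, then 'done.'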
00:05:46.677 00:05:46.677 real 0m1.271s 00:05:46.677 user 0m4.162s 00:05:46.677 sys 0m0.106s 00:05:46.677 05:05:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:46.677 05:05:17 -- common/autotest_common.sh@10 -- # set +x 00:05:46.677 ************************************ 00:05:46.677 END TEST event_perf 00:05:46.677 ************************************ 00:05:46.678 05:05:17 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:46.678 05:05:17 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:46.678 05:05:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:46.678 05:05:17 -- common/autotest_common.sh@10 -- # set +x 00:05:46.678 ************************************ 00:05:46.678 START TEST event_reactor 00:05:46.678 ************************************ 00:05:46.678 05:05:17 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:46.678 [2024-07-23 05:05:17.753791] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:05:46.678 [2024-07-23 05:05:17.753885] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3116927 ] 00:05:46.937 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.937 [2024-07-23 05:05:17.841765] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.938 [2024-07-23 05:05:17.923718] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.317 test_start 00:05:48.317 oneshot 00:05:48.317 tick 100 00:05:48.317 tick 100 00:05:48.317 tick 250 00:05:48.317 tick 100 00:05:48.317 tick 100 00:05:48.317 tick 100 00:05:48.317 tick 250 00:05:48.317 tick 500 00:05:48.317 tick 100 00:05:48.318 tick 100 00:05:48.318 tick 250 00:05:48.318 tick 100 00:05:48.318 tick 100 00:05:48.318 test_end 00:05:48.318 00:05:48.318 real 0m1.261s 00:05:48.318 user 0m1.152s 00:05:48.318 sys 0m0.103s 00:05:48.318 05:05:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.318 05:05:18 -- common/autotest_common.sh@10 -- # set +x 00:05:48.318 ************************************ 00:05:48.318 END TEST event_reactor 00:05:48.318 ************************************ 00:05:48.318 05:05:19 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:48.318 05:05:19 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:48.318 05:05:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.318 05:05:19 -- common/autotest_common.sh@10 -- # set +x 00:05:48.318 ************************************ 00:05:48.318 START TEST event_reactor_perf 00:05:48.318 ************************************ 00:05:48.318 05:05:19 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:48.318 [2024-07-23 05:05:19.063760] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:05:48.318 [2024-07-23 05:05:19.063882] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3117199 ] 00:05:48.318 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.318 [2024-07-23 05:05:19.162702] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.318 [2024-07-23 05:05:19.244205] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.258 test_start 00:05:49.258 test_end 00:05:49.258 Performance: 613785 events per second 00:05:49.258 00:05:49.258 real 0m1.275s 00:05:49.258 user 0m1.157s 00:05:49.258 sys 0m0.112s 00:05:49.258 05:05:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.258 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.258 ************************************ 00:05:49.258 END TEST event_reactor_perf 00:05:49.258 ************************************ 00:05:49.518 05:05:20 -- event/event.sh@49 -- # uname -s 00:05:49.518 05:05:20 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:49.518 05:05:20 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:49.518 05:05:20 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:49.518 05:05:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:49.518 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.518 ************************************ 00:05:49.518 START TEST event_scheduler 00:05:49.518 ************************************ 00:05:49.518 05:05:20 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:49.518 * Looking for test storage... 00:05:49.518 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:49.518 05:05:20 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:49.518 05:05:20 -- scheduler/scheduler.sh@35 -- # scheduler_pid=3117515 00:05:49.518 05:05:20 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:49.518 05:05:20 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:49.518 05:05:20 -- scheduler/scheduler.sh@37 -- # waitforlisten 3117515 00:05:49.518 05:05:20 -- common/autotest_common.sh@819 -- # '[' -z 3117515 ']' 00:05:49.518 05:05:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:49.518 05:05:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:49.518 05:05:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:49.518 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:49.518 05:05:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:49.518 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.518 [2024-07-23 05:05:20.495467] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:05:49.518 [2024-07-23 05:05:20.495563] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3117515 ] 00:05:49.518 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.518 [2024-07-23 05:05:20.571593] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:49.778 [2024-07-23 05:05:20.648788] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.778 [2024-07-23 05:05:20.648884] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:49.778 [2024-07-23 05:05:20.648966] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:49.778 [2024-07-23 05:05:20.648968] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:49.778 05:05:20 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:49.778 05:05:20 -- common/autotest_common.sh@852 -- # return 0 00:05:49.778 05:05:20 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:49.778 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.778 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.778 POWER: Env isn't set yet! 00:05:49.778 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:49.778 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:49.778 POWER: Cannot set governor of lcore 0 to userspace 00:05:49.778 POWER: Attempting to initialise PSTAT power management... 00:05:49.778 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:49.778 POWER: Initialized successfully for lcore 0 power management 00:05:49.778 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:49.778 POWER: Initialized successfully for lcore 1 power management 00:05:49.778 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:49.778 POWER: Initialized successfully for lcore 2 power management 00:05:49.778 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:49.778 POWER: Initialized successfully for lcore 3 power management 00:05:49.778 [2024-07-23 05:05:20.733606] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:49.778 [2024-07-23 05:05:20.733618] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:49.778 [2024-07-23 05:05:20.733626] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:49.778 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:49.778 05:05:20 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:49.778 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.778 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.778 [2024-07-23 05:05:20.808353] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
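The POWER lines above show DPDK's power library switching each lcore's cpufreq governor to 'performance' for the test and, as the teardown later shows, back again. Mechanically that amounts to sysfs writes like the sketch below; the path comes straight from the failure message, while the loop itself is only an illustration:

    # Illustrative governor switch per lcore, matching the POWER messages above.
    for cpu in 0 1 2 3; do
        gov=/sys/devices/system/cpu/cpu$cpu/cpufreq/scaling_governor
        orig=$(cat "$gov")                             # remember the original governor
        echo performance | sudo tee "$gov" >/dev/null  # switch for the test's duration
        echo "lcore $cpu: $orig -> performance"
    done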
00:05:49.778 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:49.778 05:05:20 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:49.778 05:05:20 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:49.778 05:05:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:49.778 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.778 ************************************ 00:05:49.778 START TEST scheduler_create_thread 00:05:49.778 ************************************ 00:05:49.778 05:05:20 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:05:49.778 05:05:20 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:49.778 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.778 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.778 2 00:05:49.778 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:49.778 05:05:20 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:49.778 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.778 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.778 3 00:05:49.778 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:49.778 05:05:20 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:49.779 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.779 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.779 4 00:05:49.779 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:49.779 05:05:20 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:49.779 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.779 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:49.779 5 00:05:49.779 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:49.779 05:05:20 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:49.779 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.779 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.038 6 00:05:50.038 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.038 05:05:20 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:50.038 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.038 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.038 7 00:05:50.038 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.038 05:05:20 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:50.038 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.038 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.038 8 00:05:50.038 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.038 05:05:20 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:50.038 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.038 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.038 9 00:05:50.038 
05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.038 05:05:20 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:50.038 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.038 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.038 10 00:05:50.038 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.038 05:05:20 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:50.038 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.038 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.038 05:05:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.038 05:05:20 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:50.038 05:05:20 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:50.038 05:05:20 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.038 05:05:20 -- common/autotest_common.sh@10 -- # set +x 00:05:50.976 05:05:21 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.976 05:05:21 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:50.977 05:05:21 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.977 05:05:21 -- common/autotest_common.sh@10 -- # set +x 00:05:52.359 05:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:52.359 05:05:23 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:52.359 05:05:23 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:52.359 05:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:52.359 05:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:53.298 05:05:24 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:53.298 00:05:53.298 real 0m3.383s 00:05:53.298 user 0m0.024s 00:05:53.298 sys 0m0.008s 00:05:53.298 05:05:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:53.298 05:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:53.298 ************************************ 00:05:53.298 END TEST scheduler_create_thread 00:05:53.298 ************************************ 00:05:53.298 05:05:24 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:53.298 05:05:24 -- scheduler/scheduler.sh@46 -- # killprocess 3117515 00:05:53.298 05:05:24 -- common/autotest_common.sh@926 -- # '[' -z 3117515 ']' 00:05:53.298 05:05:24 -- common/autotest_common.sh@930 -- # kill -0 3117515 00:05:53.298 05:05:24 -- common/autotest_common.sh@931 -- # uname 00:05:53.298 05:05:24 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:53.298 05:05:24 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3117515 00:05:53.298 05:05:24 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:53.298 05:05:24 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:53.298 05:05:24 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3117515' 00:05:53.298 killing process with pid 3117515 00:05:53.298 05:05:24 -- common/autotest_common.sh@945 -- # kill 3117515 00:05:53.298 05:05:24 -- common/autotest_common.sh@950 -- # wait 3117515 00:05:53.556 [2024-07-23 05:05:24.580676] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
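scheduler_create_thread above drives the scheduler test app purely over JSON-RPC, using the plugin methods visible in the xtrace. Condensed, the sequence is roughly as follows (the thread ids 11 and 12 are the ones captured in this run; the scheduler_plugin module is assumed to be importable, as the harness arranges):

    # Condensed RPC sequence from the scheduler_create_thread trace above.
    rpc() { ./spdk/scripts/rpc.py --plugin scheduler_plugin "$@"; }
    rpc scheduler_thread_create -n active_pinned -m 0x1 -a 100   # pinned thread, 100% active
    rpc scheduler_thread_create -n half_active -a 0              # returns a thread id (11 here)
    rpc scheduler_thread_set_active 11 50                        # raise it to 50% busy
    rpc scheduler_thread_create -n deleted -a 100                # id 12 in this run
    rpc scheduler_thread_delete 12                               # delete it again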
00:05:53.815 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:53.815 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:53.815 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:53.815 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:53.815 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:53.815 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:53.815 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:53.815 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:53.815 00:05:53.815 real 0m4.427s 00:05:53.815 user 0m7.815s 00:05:53.815 sys 0m0.374s 00:05:53.815 05:05:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:53.815 05:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:53.815 ************************************ 00:05:53.815 END TEST event_scheduler 00:05:53.815 ************************************ 00:05:53.815 05:05:24 -- event/event.sh@51 -- # modprobe -n nbd 00:05:53.815 05:05:24 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:53.815 05:05:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:53.815 05:05:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:53.815 05:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:53.815 ************************************ 00:05:53.815 START TEST app_repeat 00:05:53.815 ************************************ 00:05:53.815 05:05:24 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:05:53.815 05:05:24 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:53.815 05:05:24 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:53.815 05:05:24 -- event/event.sh@13 -- # local nbd_list 00:05:53.815 05:05:24 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:53.815 05:05:24 -- event/event.sh@14 -- # local bdev_list 00:05:53.815 05:05:24 -- event/event.sh@15 -- # local repeat_times=4 00:05:53.815 05:05:24 -- event/event.sh@17 -- # modprobe nbd 00:05:53.815 05:05:24 -- event/event.sh@19 -- # repeat_pid=3118365 00:05:53.815 05:05:24 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:53.815 05:05:24 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:53.815 05:05:24 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3118365' 00:05:53.815 Process app_repeat pid: 3118365 00:05:53.815 05:05:24 -- event/event.sh@23 -- # for i in {0..2} 00:05:53.815 05:05:24 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:53.815 spdk_app_start Round 0 00:05:53.815 05:05:24 -- event/event.sh@25 -- # waitforlisten 3118365 /var/tmp/spdk-nbd.sock 00:05:53.815 05:05:24 -- common/autotest_common.sh@819 -- # '[' -z 3118365 ']' 00:05:53.815 05:05:24 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:53.815 05:05:24 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:53.815 05:05:24 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:05:53.815 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:53.815 05:05:24 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:53.815 05:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:53.815 [2024-07-23 05:05:24.889670] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:05:53.816 [2024-07-23 05:05:24.889768] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3118365 ] 00:05:54.075 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.075 [2024-07-23 05:05:24.989581] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:54.075 [2024-07-23 05:05:25.074696] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:54.075 [2024-07-23 05:05:25.074700] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.012 05:05:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:55.012 05:05:25 -- common/autotest_common.sh@852 -- # return 0 00:05:55.012 05:05:25 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:55.012 Malloc0 00:05:55.012 05:05:26 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:55.271 Malloc1 00:05:55.271 05:05:26 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@12 -- # local i 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:55.271 05:05:26 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:55.531 /dev/nbd0 00:05:55.531 05:05:26 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:55.531 05:05:26 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:55.531 05:05:26 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:55.531 05:05:26 -- common/autotest_common.sh@857 -- # local i 00:05:55.531 05:05:26 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:55.531 05:05:26 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:55.531 05:05:26 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:55.531 05:05:26 -- 
common/autotest_common.sh@861 -- # break 00:05:55.531 05:05:26 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:55.531 05:05:26 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:55.531 05:05:26 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:55.531 1+0 records in 00:05:55.531 1+0 records out 00:05:55.531 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251984 s, 16.3 MB/s 00:05:55.531 05:05:26 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:55.531 05:05:26 -- common/autotest_common.sh@874 -- # size=4096 00:05:55.531 05:05:26 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:55.531 05:05:26 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:55.531 05:05:26 -- common/autotest_common.sh@877 -- # return 0 00:05:55.531 05:05:26 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:55.531 05:05:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:55.531 05:05:26 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:55.790 /dev/nbd1 00:05:55.790 05:05:26 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:55.790 05:05:26 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:55.790 05:05:26 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:55.790 05:05:26 -- common/autotest_common.sh@857 -- # local i 00:05:55.790 05:05:26 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:55.790 05:05:26 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:55.790 05:05:26 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:55.790 05:05:26 -- common/autotest_common.sh@861 -- # break 00:05:55.790 05:05:26 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:55.790 05:05:26 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:55.790 05:05:26 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:55.790 1+0 records in 00:05:55.790 1+0 records out 00:05:55.790 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265778 s, 15.4 MB/s 00:05:55.790 05:05:26 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:55.790 05:05:26 -- common/autotest_common.sh@874 -- # size=4096 00:05:55.790 05:05:26 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:55.790 05:05:26 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:55.790 05:05:26 -- common/autotest_common.sh@877 -- # return 0 00:05:55.790 05:05:26 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:55.790 05:05:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:55.790 05:05:26 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:55.791 05:05:26 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:55.791 05:05:26 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:56.050 05:05:26 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:56.050 { 00:05:56.050 "nbd_device": "/dev/nbd0", 00:05:56.050 "bdev_name": "Malloc0" 00:05:56.050 }, 00:05:56.050 { 00:05:56.050 "nbd_device": 
"/dev/nbd1", 00:05:56.050 "bdev_name": "Malloc1" 00:05:56.050 } 00:05:56.050 ]' 00:05:56.050 05:05:26 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:56.050 { 00:05:56.050 "nbd_device": "/dev/nbd0", 00:05:56.050 "bdev_name": "Malloc0" 00:05:56.050 }, 00:05:56.050 { 00:05:56.050 "nbd_device": "/dev/nbd1", 00:05:56.050 "bdev_name": "Malloc1" 00:05:56.050 } 00:05:56.050 ]' 00:05:56.050 05:05:26 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:56.050 /dev/nbd1' 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:56.050 /dev/nbd1' 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@65 -- # count=2 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@95 -- # count=2 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:56.050 256+0 records in 00:05:56.050 256+0 records out 00:05:56.050 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00558408 s, 188 MB/s 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:56.050 256+0 records in 00:05:56.050 256+0 records out 00:05:56.050 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0280767 s, 37.3 MB/s 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:56.050 256+0 records in 00:05:56.050 256+0 records out 00:05:56.050 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0222263 s, 47.2 MB/s 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:56.050 05:05:27 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@51 -- # local i 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:56.050 05:05:27 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@41 -- # break 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@45 -- # return 0 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:56.309 05:05:27 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@41 -- # break 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@45 -- # return 0 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:56.568 05:05:27 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:56.832 05:05:27 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:56.832 05:05:27 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:56.832 05:05:27 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:56.833 05:05:27 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:56.833 05:05:27 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:56.833 05:05:27 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:56.833 05:05:27 -- bdev/nbd_common.sh@65 -- # true 00:05:56.833 05:05:27 -- bdev/nbd_common.sh@65 -- # count=0 00:05:56.833 05:05:27 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:56.833 05:05:27 -- bdev/nbd_common.sh@104 -- # count=0 00:05:56.833 05:05:27 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:56.833 05:05:27 -- bdev/nbd_common.sh@109 -- # return 0 00:05:56.833 05:05:27 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
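Round 0 above is a complete write/verify pass over the two nbd-exported malloc bdevs. Stripped of the harness plumbing, the data path the trace exercises is the following (paths shortened; block size and count as logged):

    # Condensed nbd write/verify pass from the trace above.
    tmp=nbdrandtest
    dd if=/dev/urandom of=$tmp bs=4096 count=256           # 1 MiB of random data
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if=$tmp of=$nbd bs=4096 count=256 oflag=direct  # write through each nbd device
    done
    for nbd in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M $tmp $nbd                             # byte-for-byte read-back check
    done
    rm $tmp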
00:05:57.092 05:05:28 -- event/event.sh@35 -- # sleep 3 00:05:57.351 [2024-07-23 05:05:28.266775] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:57.351 [2024-07-23 05:05:28.343975] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:57.351 [2024-07-23 05:05:28.343980] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.351 [2024-07-23 05:05:28.387061] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:57.351 [2024-07-23 05:05:28.387110] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:00.640 05:05:31 -- event/event.sh@23 -- # for i in {0..2} 00:06:00.640 05:05:31 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:00.640 spdk_app_start Round 1 00:06:00.640 05:05:31 -- event/event.sh@25 -- # waitforlisten 3118365 /var/tmp/spdk-nbd.sock 00:06:00.640 05:05:31 -- common/autotest_common.sh@819 -- # '[' -z 3118365 ']' 00:06:00.640 05:05:31 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:00.640 05:05:31 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:00.640 05:05:31 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:00.640 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:00.640 05:05:31 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:00.640 05:05:31 -- common/autotest_common.sh@10 -- # set +x 00:06:00.640 05:05:31 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:00.640 05:05:31 -- common/autotest_common.sh@852 -- # return 0 00:06:00.640 05:05:31 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:00.640 Malloc0 00:06:00.640 05:05:31 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:00.640 Malloc1 00:06:00.640 05:05:31 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@12 -- # local i 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:00.640 05:05:31 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:00.899 
/dev/nbd0 00:06:00.899 05:05:31 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:00.899 05:05:31 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:00.899 05:05:31 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:06:00.899 05:05:31 -- common/autotest_common.sh@857 -- # local i 00:06:00.899 05:05:31 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:00.899 05:05:31 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:00.899 05:05:31 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:06:00.899 05:05:31 -- common/autotest_common.sh@861 -- # break 00:06:00.899 05:05:31 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:00.899 05:05:31 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:00.899 05:05:31 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:00.899 1+0 records in 00:06:00.899 1+0 records out 00:06:00.899 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000222789 s, 18.4 MB/s 00:06:00.899 05:05:31 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:00.899 05:05:31 -- common/autotest_common.sh@874 -- # size=4096 00:06:00.899 05:05:31 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:00.899 05:05:31 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:00.899 05:05:31 -- common/autotest_common.sh@877 -- # return 0 00:06:00.899 05:05:31 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:00.899 05:05:31 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:00.899 05:05:31 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:01.158 /dev/nbd1 00:06:01.158 05:05:32 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:01.158 05:05:32 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:01.158 05:05:32 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:06:01.158 05:05:32 -- common/autotest_common.sh@857 -- # local i 00:06:01.158 05:05:32 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:01.158 05:05:32 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:01.158 05:05:32 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:06:01.158 05:05:32 -- common/autotest_common.sh@861 -- # break 00:06:01.158 05:05:32 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:01.158 05:05:32 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:01.158 05:05:32 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:01.158 1+0 records in 00:06:01.158 1+0 records out 00:06:01.158 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280313 s, 14.6 MB/s 00:06:01.158 05:05:32 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:01.158 05:05:32 -- common/autotest_common.sh@874 -- # size=4096 00:06:01.158 05:05:32 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:01.158 05:05:32 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:01.158 05:05:32 -- common/autotest_common.sh@877 -- # return 0 00:06:01.158 05:05:32 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:01.158 05:05:32 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:06:01.158 05:05:32 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:01.158 05:05:32 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:01.158 05:05:32 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:01.417 05:05:32 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:01.417 { 00:06:01.417 "nbd_device": "/dev/nbd0", 00:06:01.417 "bdev_name": "Malloc0" 00:06:01.417 }, 00:06:01.417 { 00:06:01.417 "nbd_device": "/dev/nbd1", 00:06:01.417 "bdev_name": "Malloc1" 00:06:01.417 } 00:06:01.417 ]' 00:06:01.417 05:05:32 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:01.417 { 00:06:01.417 "nbd_device": "/dev/nbd0", 00:06:01.417 "bdev_name": "Malloc0" 00:06:01.417 }, 00:06:01.417 { 00:06:01.417 "nbd_device": "/dev/nbd1", 00:06:01.417 "bdev_name": "Malloc1" 00:06:01.417 } 00:06:01.417 ]' 00:06:01.417 05:05:32 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:01.417 05:05:32 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:01.417 /dev/nbd1' 00:06:01.417 05:05:32 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:01.417 /dev/nbd1' 00:06:01.417 05:05:32 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:01.417 05:05:32 -- bdev/nbd_common.sh@65 -- # count=2 00:06:01.417 05:05:32 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:01.417 05:05:32 -- bdev/nbd_common.sh@95 -- # count=2 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:01.676 256+0 records in 00:06:01.676 256+0 records out 00:06:01.676 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105433 s, 99.5 MB/s 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:01.676 256+0 records in 00:06:01.676 256+0 records out 00:06:01.676 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0227117 s, 46.2 MB/s 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:01.676 256+0 records in 00:06:01.676 256+0 records out 00:06:01.676 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0295435 s, 35.5 MB/s 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@51 -- # local i 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:01.676 05:05:32 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@41 -- # break 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@45 -- # return 0 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:01.935 05:05:32 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@41 -- # break 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@45 -- # return 0 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.194 05:05:33 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:02.452 05:05:33 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@65 -- # true 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@65 -- # count=0 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@104 -- # count=0 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:02.452 05:05:33 -- bdev/nbd_common.sh@109 -- # return 0 00:06:02.452 05:05:33 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:02.710 05:05:33 -- event/event.sh@35 -- # sleep 3 00:06:02.970 [2024-07-23 05:05:33.825050] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:02.970 [2024-07-23 05:05:33.902669] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:02.970 [2024-07-23 05:05:33.902673] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.970 [2024-07-23 05:05:33.945272] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:02.970 [2024-07-23 05:05:33.945336] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:06.260 05:05:36 -- event/event.sh@23 -- # for i in {0..2} 00:06:06.260 05:05:36 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:06.260 spdk_app_start Round 2 00:06:06.260 05:05:36 -- event/event.sh@25 -- # waitforlisten 3118365 /var/tmp/spdk-nbd.sock 00:06:06.260 05:05:36 -- common/autotest_common.sh@819 -- # '[' -z 3118365 ']' 00:06:06.260 05:05:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:06.260 05:05:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:06.260 05:05:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:06.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
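Round 2 begins by waiting for the restarted app_repeat instance to open /var/tmp/spdk-nbd.sock. The externally visible behaviour of the waitforlisten helper (pid argument and max_retries=100, both per the trace) can be sketched as below; the socket probe is an assumption, since the real helper polls differently:

    # Sketch of waitforlisten; interface as traced, internals assumed.
    waitforlisten() {
        local pid=$1 addr=${2:-/var/tmp/spdk.sock} max_retries=100
        echo "Waiting for process to start up and listen on UNIX domain socket $addr..."
        for _ in $(seq 1 "$max_retries"); do
            kill -0 "$pid" 2>/dev/null || return 1     # app died before listening
            [ -S "$addr" ] && return 0                 # socket present: assume it is up
            sleep 0.1
        done
        return 1
    }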
00:06:06.260 05:05:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:06.260 05:05:36 -- common/autotest_common.sh@10 -- # set +x 00:06:06.260 05:05:36 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:06.260 05:05:36 -- common/autotest_common.sh@852 -- # return 0 00:06:06.260 05:05:36 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:06.260 Malloc0 00:06:06.260 05:05:37 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:06.260 Malloc1 00:06:06.260 05:05:37 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@12 -- # local i 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:06.260 05:05:37 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:06.519 /dev/nbd0 00:06:06.519 05:05:37 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:06.519 05:05:37 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:06.519 05:05:37 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:06:06.519 05:05:37 -- common/autotest_common.sh@857 -- # local i 00:06:06.519 05:05:37 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:06.519 05:05:37 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:06.519 05:05:37 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:06:06.519 05:05:37 -- common/autotest_common.sh@861 -- # break 00:06:06.519 05:05:37 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:06.519 05:05:37 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:06.519 05:05:37 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:06.519 1+0 records in 00:06:06.519 1+0 records out 00:06:06.519 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253849 s, 16.1 MB/s 00:06:06.519 05:05:37 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:06.519 05:05:37 -- common/autotest_common.sh@874 -- # size=4096 00:06:06.519 05:05:37 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:06.519 05:05:37 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:06.519 05:05:37 -- common/autotest_common.sh@877 -- # return 0 00:06:06.519 05:05:37 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:06.519 05:05:37 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:06.519 05:05:37 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:06.779 /dev/nbd1 00:06:06.779 05:05:37 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:06.779 05:05:37 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:06.779 05:05:37 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:06:06.779 05:05:37 -- common/autotest_common.sh@857 -- # local i 00:06:06.779 05:05:37 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:06.779 05:05:37 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:06.779 05:05:37 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:06:06.779 05:05:37 -- common/autotest_common.sh@861 -- # break 00:06:06.779 05:05:37 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:06.779 05:05:37 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:06.779 05:05:37 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:06.779 1+0 records in 00:06:06.779 1+0 records out 00:06:06.779 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256022 s, 16.0 MB/s 00:06:06.779 05:05:37 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:06.779 05:05:37 -- common/autotest_common.sh@874 -- # size=4096 00:06:06.779 05:05:37 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:06.779 05:05:37 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:06.779 05:05:37 -- common/autotest_common.sh@877 -- # return 0 00:06:06.779 05:05:37 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:06.779 05:05:37 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:06.779 05:05:37 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:06.779 05:05:37 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:06.779 05:05:37 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:07.039 05:05:37 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:07.039 { 00:06:07.039 "nbd_device": "/dev/nbd0", 00:06:07.039 "bdev_name": "Malloc0" 00:06:07.039 }, 00:06:07.039 { 00:06:07.039 "nbd_device": "/dev/nbd1", 00:06:07.039 "bdev_name": "Malloc1" 00:06:07.039 } 00:06:07.039 ]' 00:06:07.039 05:05:37 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:07.039 { 00:06:07.039 "nbd_device": "/dev/nbd0", 00:06:07.039 "bdev_name": "Malloc0" 00:06:07.039 }, 00:06:07.039 { 00:06:07.039 "nbd_device": "/dev/nbd1", 00:06:07.039 "bdev_name": "Malloc1" 00:06:07.039 } 00:06:07.039 ]' 00:06:07.039 05:05:37 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:07.039 /dev/nbd1' 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:07.039 /dev/nbd1' 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@65 -- # count=2 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:07.039 05:05:38 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:07.039 256+0 records in 00:06:07.039 256+0 records out 00:06:07.039 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106372 s, 98.6 MB/s 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:07.039 256+0 records in 00:06:07.039 256+0 records out 00:06:07.039 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0281593 s, 37.2 MB/s 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:07.039 256+0 records in 00:06:07.039 256+0 records out 00:06:07.039 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0300202 s, 34.9 MB/s 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:07.039 05:05:38 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@51 -- # local i 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@41 -- # break 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@45 -- # return 0 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:07.299 05:05:38 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@41 -- # break 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@45 -- # return 0 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.559 05:05:38 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@65 -- # true 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@65 -- # count=0 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@104 -- # count=0 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:07.818 05:05:38 -- bdev/nbd_common.sh@109 -- # return 0 00:06:07.818 05:05:38 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:08.077 05:05:39 -- event/event.sh@35 -- # sleep 3 00:06:08.337 [2024-07-23 05:05:39.349612] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:08.337 [2024-07-23 05:05:39.426383] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:08.337 [2024-07-23 05:05:39.426389] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.596 [2024-07-23 05:05:39.469495] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:08.596 [2024-07-23 05:05:39.469547] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
00:06:11.133 05:05:42 -- event/event.sh@38 -- # waitforlisten 3118365 /var/tmp/spdk-nbd.sock 00:06:11.133 05:05:42 -- common/autotest_common.sh@819 -- # '[' -z 3118365 ']' 00:06:11.133 05:05:42 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:11.133 05:05:42 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:11.133 05:05:42 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:11.133 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:11.133 05:05:42 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:11.133 05:05:42 -- common/autotest_common.sh@10 -- # set +x 00:06:11.392 05:05:42 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:11.392 05:05:42 -- common/autotest_common.sh@852 -- # return 0 00:06:11.392 05:05:42 -- event/event.sh@39 -- # killprocess 3118365 00:06:11.392 05:05:42 -- common/autotest_common.sh@926 -- # '[' -z 3118365 ']' 00:06:11.392 05:05:42 -- common/autotest_common.sh@930 -- # kill -0 3118365 00:06:11.392 05:05:42 -- common/autotest_common.sh@931 -- # uname 00:06:11.392 05:05:42 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:11.392 05:05:42 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3118365 00:06:11.392 05:05:42 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:11.392 05:05:42 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:11.392 05:05:42 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3118365' 00:06:11.392 killing process with pid 3118365 00:06:11.392 05:05:42 -- common/autotest_common.sh@945 -- # kill 3118365 00:06:11.392 05:05:42 -- common/autotest_common.sh@950 -- # wait 3118365 00:06:11.651 spdk_app_start is called in Round 0. 00:06:11.651 Shutdown signal received, stop current app iteration 00:06:11.651 Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 reinitialization... 00:06:11.651 spdk_app_start is called in Round 1. 00:06:11.651 Shutdown signal received, stop current app iteration 00:06:11.651 Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 reinitialization... 00:06:11.651 spdk_app_start is called in Round 2. 00:06:11.651 Shutdown signal received, stop current app iteration 00:06:11.651 Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 reinitialization... 00:06:11.651 spdk_app_start is called in Round 3. 
00:06:11.651 Shutdown signal received, stop current app iteration 00:06:11.651 05:05:42 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:11.651 05:05:42 -- event/event.sh@42 -- # return 0 00:06:11.651 00:06:11.651 real 0m17.733s 00:06:11.651 user 0m38.100s 00:06:11.651 sys 0m3.623s 00:06:11.651 05:05:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.651 05:05:42 -- common/autotest_common.sh@10 -- # set +x 00:06:11.651 ************************************ 00:06:11.651 END TEST app_repeat 00:06:11.651 ************************************ 00:06:11.651 05:05:42 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:11.651 05:05:42 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:11.651 05:05:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:11.651 05:05:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.651 05:05:42 -- common/autotest_common.sh@10 -- # set +x 00:06:11.651 ************************************ 00:06:11.651 START TEST cpu_locks 00:06:11.651 ************************************ 00:06:11.651 05:05:42 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:11.651 * Looking for test storage... 00:06:11.910 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:11.910 05:05:42 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:11.910 05:05:42 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:11.910 05:05:42 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:11.910 05:05:42 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:11.910 05:05:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:11.910 05:05:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.910 05:05:42 -- common/autotest_common.sh@10 -- # set +x 00:06:11.910 ************************************ 00:06:11.910 START TEST default_locks 00:06:11.910 ************************************ 00:06:11.910 05:05:42 -- common/autotest_common.sh@1104 -- # default_locks 00:06:11.910 05:05:42 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=3121654 00:06:11.910 05:05:42 -- event/cpu_locks.sh@47 -- # waitforlisten 3121654 00:06:11.910 05:05:42 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:11.910 05:05:42 -- common/autotest_common.sh@819 -- # '[' -z 3121654 ']' 00:06:11.910 05:05:42 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:11.910 05:05:42 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:11.910 05:05:42 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:11.910 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:11.910 05:05:42 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:11.910 05:05:42 -- common/autotest_common.sh@10 -- # set +x 00:06:11.910 [2024-07-23 05:05:42.787307] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:06:11.910 [2024-07-23 05:05:42.787379] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3121654 ] 00:06:11.910 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.910 [2024-07-23 05:05:42.882268] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.910 [2024-07-23 05:05:42.967343] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:11.910 [2024-07-23 05:05:42.967485] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.860 05:05:43 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:12.860 05:05:43 -- common/autotest_common.sh@852 -- # return 0 00:06:12.860 05:05:43 -- event/cpu_locks.sh@49 -- # locks_exist 3121654 00:06:12.860 05:05:43 -- event/cpu_locks.sh@22 -- # lslocks -p 3121654 00:06:12.860 05:05:43 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:13.428 lslocks: write error 00:06:13.428 05:05:44 -- event/cpu_locks.sh@50 -- # killprocess 3121654 00:06:13.428 05:05:44 -- common/autotest_common.sh@926 -- # '[' -z 3121654 ']' 00:06:13.428 05:05:44 -- common/autotest_common.sh@930 -- # kill -0 3121654 00:06:13.428 05:05:44 -- common/autotest_common.sh@931 -- # uname 00:06:13.428 05:05:44 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:13.428 05:05:44 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3121654 00:06:13.429 05:05:44 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:13.429 05:05:44 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:13.429 05:05:44 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3121654' 00:06:13.429 killing process with pid 3121654 00:06:13.429 05:05:44 -- common/autotest_common.sh@945 -- # kill 3121654 00:06:13.429 05:05:44 -- common/autotest_common.sh@950 -- # wait 3121654 00:06:13.688 05:05:44 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 3121654 00:06:13.688 05:05:44 -- common/autotest_common.sh@640 -- # local es=0 00:06:13.688 05:05:44 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 3121654 00:06:13.688 05:05:44 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:13.688 05:05:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:13.688 05:05:44 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:13.688 05:05:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:13.688 05:05:44 -- common/autotest_common.sh@643 -- # waitforlisten 3121654 00:06:13.688 05:05:44 -- common/autotest_common.sh@819 -- # '[' -z 3121654 ']' 00:06:13.688 05:05:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:13.688 05:05:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:13.688 05:05:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:13.688 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:13.688 05:05:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:13.688 05:05:44 -- common/autotest_common.sh@10 -- # set +x 00:06:13.689 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (3121654) - No such process 00:06:13.689 ERROR: process (pid: 3121654) is no longer running 00:06:13.689 05:05:44 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:13.689 05:05:44 -- common/autotest_common.sh@852 -- # return 1 00:06:13.689 05:05:44 -- common/autotest_common.sh@643 -- # es=1 00:06:13.689 05:05:44 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:13.689 05:05:44 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:13.689 05:05:44 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:13.689 05:05:44 -- event/cpu_locks.sh@54 -- # no_locks 00:06:13.689 05:05:44 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:13.689 05:05:44 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:13.689 05:05:44 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:13.689 00:06:13.689 real 0m1.950s 00:06:13.689 user 0m2.101s 00:06:13.689 sys 0m0.742s 00:06:13.689 05:05:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:13.689 05:05:44 -- common/autotest_common.sh@10 -- # set +x 00:06:13.689 ************************************ 00:06:13.689 END TEST default_locks 00:06:13.689 ************************************ 00:06:13.689 05:05:44 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:13.689 05:05:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:13.689 05:05:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:13.689 05:05:44 -- common/autotest_common.sh@10 -- # set +x 00:06:13.689 ************************************ 00:06:13.689 START TEST default_locks_via_rpc 00:06:13.689 ************************************ 00:06:13.689 05:05:44 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:06:13.689 05:05:44 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=3122147 00:06:13.689 05:05:44 -- event/cpu_locks.sh@63 -- # waitforlisten 3122147 00:06:13.689 05:05:44 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:13.689 05:05:44 -- common/autotest_common.sh@819 -- # '[' -z 3122147 ']' 00:06:13.689 05:05:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:13.689 05:05:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:13.689 05:05:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:13.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:13.689 05:05:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:13.689 05:05:44 -- common/autotest_common.sh@10 -- # set +x 00:06:13.949 [2024-07-23 05:05:44.787884] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:06:13.949 [2024-07-23 05:05:44.787972] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3122147 ] 00:06:13.949 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.949 [2024-07-23 05:05:44.884456] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.949 [2024-07-23 05:05:44.961874] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:13.949 [2024-07-23 05:05:44.962015] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.886 05:05:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:14.886 05:05:45 -- common/autotest_common.sh@852 -- # return 0 00:06:14.886 05:05:45 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:14.886 05:05:45 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:14.886 05:05:45 -- common/autotest_common.sh@10 -- # set +x 00:06:14.886 05:05:45 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:14.886 05:05:45 -- event/cpu_locks.sh@67 -- # no_locks 00:06:14.886 05:05:45 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:14.886 05:05:45 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:14.886 05:05:45 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:14.886 05:05:45 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:14.886 05:05:45 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:14.886 05:05:45 -- common/autotest_common.sh@10 -- # set +x 00:06:14.886 05:05:45 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:14.886 05:05:45 -- event/cpu_locks.sh@71 -- # locks_exist 3122147 00:06:14.886 05:05:45 -- event/cpu_locks.sh@22 -- # lslocks -p 3122147 00:06:14.886 05:05:45 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:15.145 05:05:46 -- event/cpu_locks.sh@73 -- # killprocess 3122147 00:06:15.145 05:05:46 -- common/autotest_common.sh@926 -- # '[' -z 3122147 ']' 00:06:15.145 05:05:46 -- common/autotest_common.sh@930 -- # kill -0 3122147 00:06:15.145 05:05:46 -- common/autotest_common.sh@931 -- # uname 00:06:15.145 05:05:46 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:15.145 05:05:46 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3122147 00:06:15.145 05:05:46 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:15.145 05:05:46 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:15.145 05:05:46 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3122147' 00:06:15.145 killing process with pid 3122147 00:06:15.145 05:05:46 -- common/autotest_common.sh@945 -- # kill 3122147 00:06:15.145 05:05:46 -- common/autotest_common.sh@950 -- # wait 3122147 00:06:15.404 00:06:15.404 real 0m1.701s 00:06:15.404 user 0m1.819s 00:06:15.404 sys 0m0.597s 00:06:15.404 05:05:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.404 05:05:46 -- common/autotest_common.sh@10 -- # set +x 00:06:15.404 ************************************ 00:06:15.404 END TEST default_locks_via_rpc 00:06:15.404 ************************************ 00:06:15.663 05:05:46 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:15.663 05:05:46 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:15.663 05:05:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:15.663 05:05:46 -- 
common/autotest_common.sh@10 -- # set +x 00:06:15.663 ************************************ 00:06:15.663 START TEST non_locking_app_on_locked_coremask 00:06:15.663 ************************************ 00:06:15.663 05:05:46 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:06:15.663 05:05:46 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=3122447 00:06:15.663 05:05:46 -- event/cpu_locks.sh@81 -- # waitforlisten 3122447 /var/tmp/spdk.sock 00:06:15.663 05:05:46 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:15.663 05:05:46 -- common/autotest_common.sh@819 -- # '[' -z 3122447 ']' 00:06:15.663 05:05:46 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:15.663 05:05:46 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:15.663 05:05:46 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:15.663 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:15.663 05:05:46 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:15.663 05:05:46 -- common/autotest_common.sh@10 -- # set +x 00:06:15.663 [2024-07-23 05:05:46.539394] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:15.663 [2024-07-23 05:05:46.539493] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3122447 ] 00:06:15.663 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.663 [2024-07-23 05:05:46.637422] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.663 [2024-07-23 05:05:46.717192] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:15.663 [2024-07-23 05:05:46.717323] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.600 05:05:47 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:16.600 05:05:47 -- common/autotest_common.sh@852 -- # return 0 00:06:16.600 05:05:47 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=3122577 00:06:16.600 05:05:47 -- event/cpu_locks.sh@85 -- # waitforlisten 3122577 /var/tmp/spdk2.sock 00:06:16.600 05:05:47 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:16.600 05:05:47 -- common/autotest_common.sh@819 -- # '[' -z 3122577 ']' 00:06:16.600 05:05:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:16.600 05:05:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:16.600 05:05:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:16.600 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:16.600 05:05:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:16.600 05:05:47 -- common/autotest_common.sh@10 -- # set +x 00:06:16.600 [2024-07-23 05:05:47.476460] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:06:16.600 [2024-07-23 05:05:47.476536] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3122577 ] 00:06:16.600 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.600 [2024-07-23 05:05:47.601062] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:16.600 [2024-07-23 05:05:47.601100] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.859 [2024-07-23 05:05:47.768787] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.859 [2024-07-23 05:05:47.768924] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.426 05:05:48 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:17.426 05:05:48 -- common/autotest_common.sh@852 -- # return 0 00:06:17.426 05:05:48 -- event/cpu_locks.sh@87 -- # locks_exist 3122447 00:06:17.426 05:05:48 -- event/cpu_locks.sh@22 -- # lslocks -p 3122447 00:06:17.426 05:05:48 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:18.804 lslocks: write error 00:06:18.805 05:05:49 -- event/cpu_locks.sh@89 -- # killprocess 3122447 00:06:18.805 05:05:49 -- common/autotest_common.sh@926 -- # '[' -z 3122447 ']' 00:06:18.805 05:05:49 -- common/autotest_common.sh@930 -- # kill -0 3122447 00:06:18.805 05:05:49 -- common/autotest_common.sh@931 -- # uname 00:06:18.805 05:05:49 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:18.805 05:05:49 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3122447 00:06:18.805 05:05:49 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:18.805 05:05:49 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:18.805 05:05:49 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3122447' 00:06:18.805 killing process with pid 3122447 00:06:18.805 05:05:49 -- common/autotest_common.sh@945 -- # kill 3122447 00:06:18.805 05:05:49 -- common/autotest_common.sh@950 -- # wait 3122447 00:06:19.373 05:05:50 -- event/cpu_locks.sh@90 -- # killprocess 3122577 00:06:19.373 05:05:50 -- common/autotest_common.sh@926 -- # '[' -z 3122577 ']' 00:06:19.373 05:05:50 -- common/autotest_common.sh@930 -- # kill -0 3122577 00:06:19.373 05:05:50 -- common/autotest_common.sh@931 -- # uname 00:06:19.373 05:05:50 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:19.373 05:05:50 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3122577 00:06:19.373 05:05:50 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:19.373 05:05:50 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:19.373 05:05:50 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3122577' 00:06:19.373 killing process with pid 3122577 00:06:19.373 05:05:50 -- common/autotest_common.sh@945 -- # kill 3122577 00:06:19.373 05:05:50 -- common/autotest_common.sh@950 -- # wait 3122577 00:06:19.631 00:06:19.631 real 0m4.200s 00:06:19.631 user 0m4.580s 00:06:19.631 sys 0m1.426s 00:06:19.631 05:05:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.631 05:05:50 -- common/autotest_common.sh@10 -- # set +x 00:06:19.631 ************************************ 00:06:19.631 END TEST non_locking_app_on_locked_coremask 00:06:19.631 ************************************ 00:06:19.890 05:05:50 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask 
locking_app_on_unlocked_coremask 00:06:19.890 05:05:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:19.890 05:05:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:19.890 05:05:50 -- common/autotest_common.sh@10 -- # set +x 00:06:19.890 ************************************ 00:06:19.890 START TEST locking_app_on_unlocked_coremask 00:06:19.890 ************************************ 00:06:19.890 05:05:50 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:06:19.890 05:05:50 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=3123256 00:06:19.890 05:05:50 -- event/cpu_locks.sh@99 -- # waitforlisten 3123256 /var/tmp/spdk.sock 00:06:19.890 05:05:50 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:19.890 05:05:50 -- common/autotest_common.sh@819 -- # '[' -z 3123256 ']' 00:06:19.890 05:05:50 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.890 05:05:50 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:19.890 05:05:50 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.890 05:05:50 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:19.890 05:05:50 -- common/autotest_common.sh@10 -- # set +x 00:06:19.890 [2024-07-23 05:05:50.785654] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:19.890 [2024-07-23 05:05:50.785727] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3123256 ] 00:06:19.890 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.890 [2024-07-23 05:05:50.882062] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:19.890 [2024-07-23 05:05:50.882099] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.890 [2024-07-23 05:05:50.963086] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:19.890 [2024-07-23 05:05:50.963225] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.826 05:05:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:20.826 05:05:51 -- common/autotest_common.sh@852 -- # return 0 00:06:20.826 05:05:51 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=3123309 00:06:20.826 05:05:51 -- event/cpu_locks.sh@103 -- # waitforlisten 3123309 /var/tmp/spdk2.sock 00:06:20.826 05:05:51 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:20.826 05:05:51 -- common/autotest_common.sh@819 -- # '[' -z 3123309 ']' 00:06:20.826 05:05:51 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:20.826 05:05:51 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:20.826 05:05:51 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:20.826 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:20.826 05:05:51 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:20.826 05:05:51 -- common/autotest_common.sh@10 -- # set +x 00:06:20.826 [2024-07-23 05:05:51.728367] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:20.826 [2024-07-23 05:05:51.728450] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3123309 ] 00:06:20.826 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.826 [2024-07-23 05:05:51.850902] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.086 [2024-07-23 05:05:52.017427] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:21.086 [2024-07-23 05:05:52.017590] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.654 05:05:52 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:21.654 05:05:52 -- common/autotest_common.sh@852 -- # return 0 00:06:21.654 05:05:52 -- event/cpu_locks.sh@105 -- # locks_exist 3123309 00:06:21.654 05:05:52 -- event/cpu_locks.sh@22 -- # lslocks -p 3123309 00:06:21.654 05:05:52 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:23.031 lslocks: write error 00:06:23.031 05:05:53 -- event/cpu_locks.sh@107 -- # killprocess 3123256 00:06:23.031 05:05:53 -- common/autotest_common.sh@926 -- # '[' -z 3123256 ']' 00:06:23.031 05:05:53 -- common/autotest_common.sh@930 -- # kill -0 3123256 00:06:23.031 05:05:53 -- common/autotest_common.sh@931 -- # uname 00:06:23.031 05:05:53 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:23.031 05:05:53 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3123256 00:06:23.031 05:05:53 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:23.031 05:05:53 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:23.031 05:05:53 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3123256' 00:06:23.031 killing process with pid 3123256 00:06:23.031 05:05:53 -- common/autotest_common.sh@945 -- # kill 3123256 00:06:23.031 05:05:53 -- common/autotest_common.sh@950 -- # wait 3123256 00:06:23.599 05:05:54 -- event/cpu_locks.sh@108 -- # killprocess 3123309 00:06:23.599 05:05:54 -- common/autotest_common.sh@926 -- # '[' -z 3123309 ']' 00:06:23.599 05:05:54 -- common/autotest_common.sh@930 -- # kill -0 3123309 00:06:23.599 05:05:54 -- common/autotest_common.sh@931 -- # uname 00:06:23.599 05:05:54 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:23.599 05:05:54 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3123309 00:06:23.599 05:05:54 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:23.599 05:05:54 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:23.599 05:05:54 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3123309' 00:06:23.599 killing process with pid 3123309 00:06:23.599 05:05:54 -- common/autotest_common.sh@945 -- # kill 3123309 00:06:23.599 05:05:54 -- common/autotest_common.sh@950 -- # wait 3123309 00:06:23.857 00:06:23.857 real 0m4.002s 00:06:23.857 user 0m4.370s 00:06:23.857 sys 0m1.358s 00:06:23.857 05:05:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.857 05:05:54 -- common/autotest_common.sh@10 -- # set +x 00:06:23.857 ************************************ 00:06:23.857 END TEST locking_app_on_unlocked_coremask 
00:06:23.857 ************************************ 00:06:23.857 05:05:54 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:23.857 05:05:54 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:23.857 05:05:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:23.857 05:05:54 -- common/autotest_common.sh@10 -- # set +x 00:06:23.857 ************************************ 00:06:23.857 START TEST locking_app_on_locked_coremask 00:06:23.857 ************************************ 00:06:23.857 05:05:54 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:06:23.857 05:05:54 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=3123883 00:06:23.857 05:05:54 -- event/cpu_locks.sh@116 -- # waitforlisten 3123883 /var/tmp/spdk.sock 00:06:23.857 05:05:54 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:23.857 05:05:54 -- common/autotest_common.sh@819 -- # '[' -z 3123883 ']' 00:06:23.857 05:05:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.857 05:05:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:23.857 05:05:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.857 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.857 05:05:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:23.857 05:05:54 -- common/autotest_common.sh@10 -- # set +x 00:06:23.857 [2024-07-23 05:05:54.839716] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:23.857 [2024-07-23 05:05:54.839806] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3123883 ] 00:06:23.857 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.857 [2024-07-23 05:05:54.938971] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.116 [2024-07-23 05:05:55.018032] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:24.116 [2024-07-23 05:05:55.018179] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.685 05:05:55 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:24.685 05:05:55 -- common/autotest_common.sh@852 -- # return 0 00:06:24.685 05:05:55 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=3124148 00:06:24.685 05:05:55 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 3124148 /var/tmp/spdk2.sock 00:06:24.685 05:05:55 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:24.685 05:05:55 -- common/autotest_common.sh@640 -- # local es=0 00:06:24.685 05:05:55 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 3124148 /var/tmp/spdk2.sock 00:06:24.685 05:05:55 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:24.685 05:05:55 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:24.685 05:05:55 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:24.685 05:05:55 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:24.685 05:05:55 -- common/autotest_common.sh@643 -- # waitforlisten 3124148 /var/tmp/spdk2.sock 00:06:24.685 05:05:55 -- 
common/autotest_common.sh@819 -- # '[' -z 3124148 ']' 00:06:24.685 05:05:55 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:24.685 05:05:55 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:24.685 05:05:55 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:24.685 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:24.685 05:05:55 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:24.685 05:05:55 -- common/autotest_common.sh@10 -- # set +x 00:06:24.944 [2024-07-23 05:05:55.778833] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:24.944 [2024-07-23 05:05:55.778903] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3124148 ] 00:06:24.944 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.944 [2024-07-23 05:05:55.902919] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 3123883 has claimed it. 00:06:24.944 [2024-07-23 05:05:55.902969] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:25.515 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (3124148) - No such process 00:06:25.515 ERROR: process (pid: 3124148) is no longer running 00:06:25.515 05:05:56 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:25.515 05:05:56 -- common/autotest_common.sh@852 -- # return 1 00:06:25.515 05:05:56 -- common/autotest_common.sh@643 -- # es=1 00:06:25.515 05:05:56 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:25.515 05:05:56 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:25.515 05:05:56 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:25.515 05:05:56 -- event/cpu_locks.sh@122 -- # locks_exist 3123883 00:06:25.515 05:05:56 -- event/cpu_locks.sh@22 -- # lslocks -p 3123883 00:06:25.515 05:05:56 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:26.087 lslocks: write error 00:06:26.087 05:05:57 -- event/cpu_locks.sh@124 -- # killprocess 3123883 00:06:26.087 05:05:57 -- common/autotest_common.sh@926 -- # '[' -z 3123883 ']' 00:06:26.087 05:05:57 -- common/autotest_common.sh@930 -- # kill -0 3123883 00:06:26.087 05:05:57 -- common/autotest_common.sh@931 -- # uname 00:06:26.087 05:05:57 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:26.087 05:05:57 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3123883 00:06:26.087 05:05:57 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:26.087 05:05:57 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:26.087 05:05:57 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3123883' 00:06:26.087 killing process with pid 3123883 00:06:26.088 05:05:57 -- common/autotest_common.sh@945 -- # kill 3123883 00:06:26.088 05:05:57 -- common/autotest_common.sh@950 -- # wait 3123883 00:06:26.656 00:06:26.656 real 0m2.652s 00:06:26.656 user 0m2.945s 00:06:26.656 sys 0m0.819s 00:06:26.656 05:05:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.656 05:05:57 -- common/autotest_common.sh@10 -- # set +x 00:06:26.656 ************************************ 00:06:26.656 END TEST locking_app_on_locked_coremask 00:06:26.656 
************************************ 00:06:26.656 05:05:57 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:26.656 05:05:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:26.656 05:05:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:26.656 05:05:57 -- common/autotest_common.sh@10 -- # set +x 00:06:26.656 ************************************ 00:06:26.656 START TEST locking_overlapped_coremask 00:06:26.656 ************************************ 00:06:26.656 05:05:57 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:06:26.656 05:05:57 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=3124457 00:06:26.656 05:05:57 -- event/cpu_locks.sh@133 -- # waitforlisten 3124457 /var/tmp/spdk.sock 00:06:26.656 05:05:57 -- common/autotest_common.sh@819 -- # '[' -z 3124457 ']' 00:06:26.656 05:05:57 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:26.656 05:05:57 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:26.656 05:05:57 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:26.656 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:26.657 05:05:57 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:26.657 05:05:57 -- common/autotest_common.sh@10 -- # set +x 00:06:26.657 05:05:57 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:26.657 [2024-07-23 05:05:57.537650] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:26.657 [2024-07-23 05:05:57.537737] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3124457 ] 00:06:26.657 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.657 [2024-07-23 05:05:57.636913] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:26.657 [2024-07-23 05:05:57.725695] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:26.657 [2024-07-23 05:05:57.725852] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:26.657 [2024-07-23 05:05:57.725945] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:26.657 [2024-07-23 05:05:57.725948] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.597 05:05:58 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:27.597 05:05:58 -- common/autotest_common.sh@852 -- # return 0 00:06:27.597 05:05:58 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=3124661 00:06:27.597 05:05:58 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 3124661 /var/tmp/spdk2.sock 00:06:27.597 05:05:58 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:27.597 05:05:58 -- common/autotest_common.sh@640 -- # local es=0 00:06:27.597 05:05:58 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 3124661 /var/tmp/spdk2.sock 00:06:27.597 05:05:58 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:27.597 05:05:58 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:27.597 05:05:58 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:27.597 05:05:58 -- common/autotest_common.sh@632 
-- # case "$(type -t "$arg")" in 00:06:27.597 05:05:58 -- common/autotest_common.sh@643 -- # waitforlisten 3124661 /var/tmp/spdk2.sock 00:06:27.597 05:05:58 -- common/autotest_common.sh@819 -- # '[' -z 3124661 ']' 00:06:27.597 05:05:58 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:27.597 05:05:58 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:27.597 05:05:58 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:27.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:27.597 05:05:58 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:27.597 05:05:58 -- common/autotest_common.sh@10 -- # set +x 00:06:27.597 [2024-07-23 05:05:58.479483] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:27.597 [2024-07-23 05:05:58.479573] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3124661 ] 00:06:27.597 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.597 [2024-07-23 05:05:58.584301] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3124457 has claimed it. 00:06:27.597 [2024-07-23 05:05:58.584335] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:28.166 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (3124661) - No such process 00:06:28.166 ERROR: process (pid: 3124661) is no longer running 00:06:28.166 05:05:59 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:28.166 05:05:59 -- common/autotest_common.sh@852 -- # return 1 00:06:28.166 05:05:59 -- common/autotest_common.sh@643 -- # es=1 00:06:28.166 05:05:59 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:28.166 05:05:59 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:28.166 05:05:59 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:28.166 05:05:59 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:28.166 05:05:59 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:28.166 05:05:59 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:28.166 05:05:59 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:28.166 05:05:59 -- event/cpu_locks.sh@141 -- # killprocess 3124457 00:06:28.166 05:05:59 -- common/autotest_common.sh@926 -- # '[' -z 3124457 ']' 00:06:28.166 05:05:59 -- common/autotest_common.sh@930 -- # kill -0 3124457 00:06:28.166 05:05:59 -- common/autotest_common.sh@931 -- # uname 00:06:28.166 05:05:59 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:28.166 05:05:59 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3124457 00:06:28.166 05:05:59 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:28.166 05:05:59 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:28.166 05:05:59 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3124457' 00:06:28.166 killing process with pid 3124457 00:06:28.166 05:05:59 -- 
common/autotest_common.sh@945 -- # kill 3124457 00:06:28.166 05:05:59 -- common/autotest_common.sh@950 -- # wait 3124457 00:06:28.736 00:06:28.736 real 0m2.057s 00:06:28.736 user 0m5.808s 00:06:28.736 sys 0m0.512s 00:06:28.736 05:05:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:28.736 05:05:59 -- common/autotest_common.sh@10 -- # set +x 00:06:28.736 ************************************ 00:06:28.736 END TEST locking_overlapped_coremask 00:06:28.736 ************************************ 00:06:28.736 05:05:59 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:28.736 05:05:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:28.736 05:05:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:28.736 05:05:59 -- common/autotest_common.sh@10 -- # set +x 00:06:28.736 ************************************ 00:06:28.736 START TEST locking_overlapped_coremask_via_rpc 00:06:28.736 ************************************ 00:06:28.736 05:05:59 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:06:28.736 05:05:59 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=3124776 00:06:28.736 05:05:59 -- event/cpu_locks.sh@149 -- # waitforlisten 3124776 /var/tmp/spdk.sock 00:06:28.736 05:05:59 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:28.736 05:05:59 -- common/autotest_common.sh@819 -- # '[' -z 3124776 ']' 00:06:28.736 05:05:59 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:28.736 05:05:59 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:28.736 05:05:59 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:28.736 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:28.736 05:05:59 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:28.736 05:05:59 -- common/autotest_common.sh@10 -- # set +x 00:06:28.736 [2024-07-23 05:05:59.644147] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:28.736 [2024-07-23 05:05:59.644220] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3124776 ] 00:06:28.736 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.736 [2024-07-23 05:05:59.741561] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:28.736 [2024-07-23 05:05:59.741592] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:28.995 [2024-07-23 05:05:59.830357] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:28.995 [2024-07-23 05:05:59.830519] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:28.995 [2024-07-23 05:05:59.830619] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:28.995 [2024-07-23 05:05:59.830621] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.564 05:06:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:29.564 05:06:00 -- common/autotest_common.sh@852 -- # return 0 00:06:29.564 05:06:00 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=3125075 00:06:29.564 05:06:00 -- event/cpu_locks.sh@153 -- # waitforlisten 3125075 /var/tmp/spdk2.sock 00:06:29.565 05:06:00 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:29.565 05:06:00 -- common/autotest_common.sh@819 -- # '[' -z 3125075 ']' 00:06:29.565 05:06:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:29.565 05:06:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:29.565 05:06:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:29.565 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:29.565 05:06:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:29.565 05:06:00 -- common/autotest_common.sh@10 -- # set +x 00:06:29.565 [2024-07-23 05:06:00.598109] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:29.565 [2024-07-23 05:06:00.598186] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3125075 ] 00:06:29.565 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.824 [2024-07-23 05:06:00.702611] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:29.824 [2024-07-23 05:06:00.702639] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:29.824 [2024-07-23 05:06:00.847031] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:29.824 [2024-07-23 05:06:00.847175] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:29.824 [2024-07-23 05:06:00.850490] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:29.824 [2024-07-23 05:06:00.850491] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:30.393 05:06:01 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:30.393 05:06:01 -- common/autotest_common.sh@852 -- # return 0 00:06:30.393 05:06:01 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:30.393 05:06:01 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:30.393 05:06:01 -- common/autotest_common.sh@10 -- # set +x 00:06:30.393 05:06:01 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:30.393 05:06:01 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:30.393 05:06:01 -- common/autotest_common.sh@640 -- # local es=0 00:06:30.393 05:06:01 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:30.393 05:06:01 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:06:30.393 05:06:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:30.393 05:06:01 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:06:30.393 05:06:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:30.393 05:06:01 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:30.393 05:06:01 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:30.393 05:06:01 -- common/autotest_common.sh@10 -- # set +x 00:06:30.393 [2024-07-23 05:06:01.482510] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3124776 has claimed it. 00:06:30.651 request: 00:06:30.651 { 00:06:30.651 "method": "framework_enable_cpumask_locks", 00:06:30.651 "req_id": 1 00:06:30.651 } 00:06:30.651 Got JSON-RPC error response 00:06:30.652 response: 00:06:30.652 { 00:06:30.652 "code": -32603, 00:06:30.652 "message": "Failed to claim CPU core: 2" 00:06:30.652 } 00:06:30.652 05:06:01 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:06:30.652 05:06:01 -- common/autotest_common.sh@643 -- # es=1 00:06:30.652 05:06:01 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:30.652 05:06:01 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:30.652 05:06:01 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:30.652 05:06:01 -- event/cpu_locks.sh@158 -- # waitforlisten 3124776 /var/tmp/spdk.sock 00:06:30.652 05:06:01 -- common/autotest_common.sh@819 -- # '[' -z 3124776 ']' 00:06:30.652 05:06:01 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.652 05:06:01 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:30.652 05:06:01 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:30.652 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:30.652 05:06:01 -- common/autotest_common.sh@828 -- # xtrace_disable
00:06:30.652 05:06:01 -- common/autotest_common.sh@10 -- # set +x
00:06:30.652 05:06:01 -- common/autotest_common.sh@848 -- # (( i == 0 ))
00:06:30.652 05:06:01 -- common/autotest_common.sh@852 -- # return 0
00:06:30.652 05:06:01 -- event/cpu_locks.sh@159 -- # waitforlisten 3125075 /var/tmp/spdk2.sock
00:06:30.652 05:06:01 -- common/autotest_common.sh@819 -- # '[' -z 3125075 ']'
00:06:30.652 05:06:01 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock
00:06:30.652 05:06:01 -- common/autotest_common.sh@824 -- # local max_retries=100
00:06:30.652 05:06:01 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...'
00:06:30.652 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...
00:06:30.652 05:06:01 -- common/autotest_common.sh@828 -- # xtrace_disable
00:06:30.652 05:06:01 -- common/autotest_common.sh@10 -- # set +x
00:06:30.910 05:06:01 -- common/autotest_common.sh@848 -- # (( i == 0 ))
00:06:30.910 05:06:01 -- common/autotest_common.sh@852 -- # return 0
00:06:30.910 05:06:01 -- event/cpu_locks.sh@161 -- # check_remaining_locks
00:06:30.910 05:06:01 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*)
00:06:30.910 05:06:01 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
00:06:30.910 05:06:01 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]]
00:06:30.910
00:06:30.910 real 0m2.337s
00:06:30.910 user 0m1.038s
00:06:30.910 sys 0m0.234s
00:06:30.910 05:06:01 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:30.910 05:06:01 -- common/autotest_common.sh@10 -- # set +x
00:06:30.910 ************************************
00:06:30.910 END TEST locking_overlapped_coremask_via_rpc
00:06:30.910 ************************************
00:06:30.910 05:06:01 -- event/cpu_locks.sh@174 -- # cleanup
00:06:30.910 05:06:01 -- event/cpu_locks.sh@15 -- # [[ -z 3124776 ]]
00:06:30.910 05:06:01 -- event/cpu_locks.sh@15 -- # killprocess 3124776
00:06:30.910 05:06:01 -- common/autotest_common.sh@926 -- # '[' -z 3124776 ']'
00:06:30.910 05:06:01 -- common/autotest_common.sh@930 -- # kill -0 3124776
00:06:30.910 05:06:01 -- common/autotest_common.sh@931 -- # uname
00:06:31.169 05:06:02 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']'
00:06:31.169 05:06:02 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3124776
00:06:31.169 05:06:02 -- common/autotest_common.sh@932 -- # process_name=reactor_0
00:06:31.169 05:06:02 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']'
00:06:31.169 05:06:02 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3124776'
00:06:31.169 killing process with pid 3124776
00:06:31.169 05:06:02 -- common/autotest_common.sh@945 -- # kill 3124776
00:06:31.169 05:06:02 -- common/autotest_common.sh@950 -- # wait 3124776
00:06:31.429 05:06:02 -- event/cpu_locks.sh@16 -- # [[ -z 3125075 ]]
00:06:31.429 05:06:02 -- event/cpu_locks.sh@16 -- # killprocess 3125075
00:06:31.429 05:06:02 -- common/autotest_common.sh@926 -- # '[' -z 3125075 ']'
00:06:31.429 05:06:02 -- common/autotest_common.sh@930 -- # kill -0 3125075
00:06:31.429 05:06:02 -- common/autotest_common.sh@931 -- # uname
00:06:31.429 05:06:02 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']'
00:06:31.429 05:06:02 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3125075
00:06:31.429 05:06:02 -- common/autotest_common.sh@932 -- # process_name=reactor_2
00:06:31.429 05:06:02 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']'
00:06:31.429 05:06:02 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3125075'
00:06:31.429 killing process with pid 3125075
00:06:31.429 05:06:02 -- common/autotest_common.sh@945 -- # kill 3125075
00:06:31.429 05:06:02 -- common/autotest_common.sh@950 -- # wait 3125075
00:06:31.688 05:06:02 -- event/cpu_locks.sh@18 -- # rm -f
00:06:31.688 05:06:02 -- event/cpu_locks.sh@1 -- # cleanup
00:06:31.688 05:06:02 -- event/cpu_locks.sh@15 -- # [[ -z 3124776 ]]
00:06:31.688 05:06:02 -- event/cpu_locks.sh@15 -- # killprocess 3124776
00:06:31.688 05:06:02 -- common/autotest_common.sh@926 -- # '[' -z 3124776 ']'
00:06:31.688 05:06:02 -- common/autotest_common.sh@930 -- # kill -0 3124776
00:06:31.688 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (3124776) - No such process
00:06:31.688 05:06:02 -- common/autotest_common.sh@953 -- # echo 'Process with pid 3124776 is not found'
00:06:31.688 Process with pid 3124776 is not found
00:06:31.688 05:06:02 -- event/cpu_locks.sh@16 -- # [[ -z 3125075 ]]
00:06:31.688 05:06:02 -- event/cpu_locks.sh@16 -- # killprocess 3125075
00:06:31.688 05:06:02 -- common/autotest_common.sh@926 -- # '[' -z 3125075 ']'
00:06:31.688 05:06:02 -- common/autotest_common.sh@930 -- # kill -0 3125075
00:06:31.688 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (3125075) - No such process
00:06:31.688 05:06:02 -- common/autotest_common.sh@953 -- # echo 'Process with pid 3125075 is not found'
00:06:31.688 Process with pid 3125075 is not found
00:06:31.688 05:06:02 -- event/cpu_locks.sh@18 -- # rm -f
00:06:31.688
00:06:31.688 real 0m20.119s
00:06:31.688 user 0m34.155s
00:06:31.688 sys 0m6.682s
00:06:31.688 05:06:02 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:31.688 05:06:02 -- common/autotest_common.sh@10 -- # set +x
00:06:31.688 ************************************
00:06:31.688 END TEST cpu_locks
00:06:31.688 ************************************
00:06:31.948
00:06:31.948 real 0m46.503s
00:06:31.948 user 1m26.681s
00:06:31.948 sys 0m11.337s
00:06:31.948 05:06:02 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:31.948 05:06:02 -- common/autotest_common.sh@10 -- # set +x
00:06:31.948 ************************************
00:06:31.948 END TEST event
00:06:31.948 ************************************
00:06:31.948 05:06:02 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh
00:06:31.948 05:06:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:06:31.948 05:06:02 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:31.948 05:06:02 -- common/autotest_common.sh@10 -- # set +x
00:06:31.948 ************************************
00:06:31.948 START TEST thread
00:06:31.948 ************************************
00:06:31.948 05:06:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh
00:06:31.948 * Looking for test storage...
00:06:31.948 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread
00:06:31.948 05:06:02 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:06:31.948 05:06:02 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']'
00:06:31.948 05:06:02 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:31.948 05:06:02 -- common/autotest_common.sh@10 -- # set +x
00:06:31.948 ************************************
00:06:31.948 START TEST thread_poller_perf
00:06:31.948 ************************************
00:06:31.948 05:06:02 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:06:31.948 [2024-07-23 05:06:02.983698] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:31.948 [2024-07-23 05:06:02.983795] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3125653 ]
00:06:31.948 EAL: No free 2048 kB hugepages reported on node 1
00:06:32.207 [2024-07-23 05:06:03.083158] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:32.207 [2024-07-23 05:06:03.171201] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:32.207 Running 1000 pollers for 1 seconds with 1 microseconds period.
00:06:33.587 ======================================
00:06:33.587 busy:2508356892 (cyc)
00:06:33.587 total_run_count: 549000
00:06:33.587 tsc_hz: 2500000000 (cyc)
00:06:33.587 ======================================
00:06:33.587 poller_cost: 4568 (cyc), 1827 (nsec)
00:06:33.587
00:06:33.587 real 0m1.283s
00:06:33.587 user 0m1.164s
00:06:33.587 sys 0m0.113s
00:06:33.587 05:06:04 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:33.587 05:06:04 -- common/autotest_common.sh@10 -- # set +x
00:06:33.587 ************************************
00:06:33.587 END TEST thread_poller_perf
00:06:33.587 ************************************
00:06:33.587 05:06:04 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:06:33.587 05:06:04 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']'
00:06:33.587 05:06:04 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:33.587 05:06:04 -- common/autotest_common.sh@10 -- # set +x
00:06:33.587 ************************************
00:06:33.587 START TEST thread_poller_perf
00:06:33.587 ************************************
00:06:33.587 05:06:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:06:33.587 [2024-07-23 05:06:04.318959] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:33.587 [2024-07-23 05:06:04.319053] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3125981 ]
00:06:33.587 EAL: No free 2048 kB hugepages reported on node 1
00:06:33.587 [2024-07-23 05:06:04.417976] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:33.587 [2024-07-23 05:06:04.504395] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:33.587 Running 1000 pollers for 1 seconds with 0 microseconds period.
00:06:34.525 ======================================
00:06:34.525 busy:2502419380 (cyc)
00:06:34.525 total_run_count: 9576000
00:06:34.525 tsc_hz: 2500000000 (cyc)
00:06:34.525 ======================================
00:06:34.525 poller_cost: 261 (cyc), 104 (nsec)
00:06:34.525
00:06:34.525 real 0m1.277s
00:06:34.525 user 0m1.157s
00:06:34.525 sys 0m0.114s
00:06:34.525 05:06:05 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:34.525 05:06:05 -- common/autotest_common.sh@10 -- # set +x
00:06:34.525 ************************************
00:06:34.525 END TEST thread_poller_perf
00:06:34.525 ************************************
00:06:34.784 05:06:05 -- thread/thread.sh@17 -- # [[ n != \y ]]
00:06:34.784 05:06:05 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock
00:06:34.784 05:06:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:06:34.784 05:06:05 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:34.784 05:06:05 -- common/autotest_common.sh@10 -- # set +x
00:06:34.784 ************************************
00:06:34.784 START TEST thread_spdk_lock
00:06:34.784 ************************************
00:06:34.784 05:06:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock
00:06:34.784 [2024-07-23 05:06:05.646918] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:34.784 [2024-07-23 05:06:05.647014] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3126493 ]
00:06:34.784 EAL: No free 2048 kB hugepages reported on node 1
00:06:34.784 [2024-07-23 05:06:05.747729] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:06:34.784 [2024-07-23 05:06:05.833130] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:34.784 [2024-07-23 05:06:05.833136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:35.353 [2024-07-23 05:06:06.339365] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:06:35.353 [2024-07-23 05:06:06.339419] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread)
00:06:35.353 [2024-07-23 05:06:06.339434] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x149c080
00:06:35.353 [2024-07-23 05:06:06.340414] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:06:35.353 [2024-07-23 05:06:06.340520] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:06:35.353 [2024-07-23 05:06:06.340546] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:06:35.353 Starting test contend
00:06:35.353 Worker Delay Wait us Hold us Total us
00:06:35.353 0 3 160464 191396 351860
00:06:35.353 1 5 84386 290742 375129
00:06:35.353 PASS test contend
00:06:35.353 Starting test hold_by_poller
00:06:35.353 PASS test hold_by_poller
00:06:35.353 Starting test hold_by_message
00:06:35.353 PASS test hold_by_message
00:06:35.353 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary:
00:06:35.353 100014 assertions passed
00:06:35.353 0 assertions failed
00:06:35.353
00:06:35.353 real 0m0.781s
00:06:35.353 user 0m1.162s
00:06:35.353 sys 0m0.121s
00:06:35.353 05:06:06 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:35.353 05:06:06 -- common/autotest_common.sh@10 -- # set +x
00:06:35.353 ************************************
00:06:35.353 END TEST thread_spdk_lock
00:06:35.353 ************************************
00:06:35.613
00:06:35.613 real 0m3.594s
00:06:35.613 user 0m3.574s
00:06:35.613 sys 0m0.548s
00:06:35.613 05:06:06 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:35.613 05:06:06 -- common/autotest_common.sh@10 -- # set +x
00:06:35.613 ************************************
00:06:35.613 END TEST thread
00:06:35.613 ************************************
00:06:35.613 05:06:06 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh
00:06:35.613 05:06:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:06:35.613 05:06:06 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:35.613 05:06:06 -- common/autotest_common.sh@10 -- # set +x
00:06:35.613 ************************************
00:06:35.613 START TEST accel
00:06:35.613 ************************************
00:06:35.613 05:06:06 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh
00:06:35.613 * Looking for test storage...
00:06:35.613 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel
00:06:35.613 05:06:06 -- accel/accel.sh@73 -- # declare -A expected_opcs
00:06:35.613 05:06:06 -- accel/accel.sh@74 -- # get_expected_opcs
00:06:35.613 05:06:06 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:06:35.613 05:06:06 -- accel/accel.sh@59 -- # spdk_tgt_pid=3126858
00:06:35.613 05:06:06 -- accel/accel.sh@60 -- # waitforlisten 3126858
00:06:35.613 05:06:06 -- common/autotest_common.sh@819 -- # '[' -z 3126858 ']'
00:06:35.613 05:06:06 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:35.613 05:06:06 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63
00:06:35.613 05:06:06 -- common/autotest_common.sh@824 -- # local max_retries=100
00:06:35.613 05:06:06 -- accel/accel.sh@58 -- # build_accel_config
00:06:35.613 05:06:06 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:35.613 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:35.613 05:06:06 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:35.613 05:06:06 -- common/autotest_common.sh@828 -- # xtrace_disable
00:06:35.613 05:06:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:35.613 05:06:06 -- common/autotest_common.sh@10 -- # set +x
00:06:35.613 05:06:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:35.613 05:06:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:35.613 05:06:06 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:35.613 05:06:06 -- accel/accel.sh@41 -- # local IFS=,
00:06:35.613 05:06:06 -- accel/accel.sh@42 -- # jq -r .
00:06:35.613 [2024-07-23 05:06:06.633435] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:35.613 [2024-07-23 05:06:06.633510] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3126858 ]
00:06:35.613 EAL: No free 2048 kB hugepages reported on node 1
00:06:35.871 [2024-07-23 05:06:06.730831] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:35.871 [2024-07-23 05:06:06.815916] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:06:35.871 [2024-07-23 05:06:06.816054] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:36.809 05:06:07 -- common/autotest_common.sh@848 -- # (( i == 0 ))
00:06:36.809 05:06:07 -- common/autotest_common.sh@852 -- # return 0
00:06:36.809 05:06:07 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]"))
00:06:36.809 05:06:07 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments
00:06:36.809 05:06:07 -- accel/accel.sh@62 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
00:06:36.809 05:06:07 -- common/autotest_common.sh@551 -- # xtrace_disable
00:06:36.809 05:06:07 -- common/autotest_common.sh@10 -- # set +x
00:06:36.809 05:06:07 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]]
00:06:36.809 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.809 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.809 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.809 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.809 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.809 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.809 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.809 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.809 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}"
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # IFS==
00:06:36.810 05:06:07 -- accel/accel.sh@64 -- # read -r opc module
00:06:36.810 05:06:07 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software
00:06:36.810 05:06:07 -- accel/accel.sh@67 -- # killprocess 3126858
00:06:36.810 05:06:07 -- common/autotest_common.sh@926 -- # '[' -z 3126858 ']'
00:06:36.810 05:06:07 -- common/autotest_common.sh@930 -- # kill -0 3126858
00:06:36.810 05:06:07 -- common/autotest_common.sh@931 -- # uname
00:06:36.810 05:06:07 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']'
00:06:36.810 05:06:07 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3126858
00:06:36.810 05:06:07 -- common/autotest_common.sh@932 -- # process_name=reactor_0
00:06:36.810 05:06:07 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']'
00:06:36.810 05:06:07 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3126858'
00:06:36.810 killing process with pid 3126858
00:06:36.810 05:06:07 -- common/autotest_common.sh@945 -- # kill 3126858
00:06:36.810 05:06:07 -- common/autotest_common.sh@950 -- # wait 3126858
00:06:37.070 05:06:07 -- accel/accel.sh@68 -- # trap - ERR
00:06:37.070 05:06:07 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h
00:06:37.070 05:06:07 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']'
00:06:37.070 05:06:07 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:37.070 05:06:07 -- common/autotest_common.sh@10 -- # set +x
00:06:37.070 05:06:07 -- common/autotest_common.sh@1104 -- # accel_perf -h
00:06:37.070 05:06:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h
00:06:37.070 05:06:07 -- accel/accel.sh@12 -- # build_accel_config
00:06:37.070 05:06:07 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:37.070 05:06:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:37.070 05:06:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:37.070 05:06:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:37.070 05:06:07 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:37.070 05:06:07 -- accel/accel.sh@41 -- # local IFS=,
00:06:37.070 05:06:07 -- accel/accel.sh@42 -- # jq -r .
00:06:37.070 05:06:08 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:37.070 05:06:08 -- common/autotest_common.sh@10 -- # set +x
00:06:37.070 05:06:08 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress
00:06:37.070 05:06:08 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']'
00:06:37.070 05:06:08 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:37.070 05:06:08 -- common/autotest_common.sh@10 -- # set +x
00:06:37.070 ************************************
00:06:37.070 START TEST accel_missing_filename
00:06:37.070 ************************************
00:06:37.070 05:06:08 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress
00:06:37.070 05:06:08 -- common/autotest_common.sh@640 -- # local es=0
00:06:37.070 05:06:08 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress
00:06:37.070 05:06:08 -- common/autotest_common.sh@628 -- # local arg=accel_perf
00:06:37.070 05:06:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in
00:06:37.070 05:06:08 -- common/autotest_common.sh@632 -- # type -t accel_perf
00:06:37.070 05:06:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in
00:06:37.070 05:06:08 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress
00:06:37.070 05:06:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress
00:06:37.070 05:06:08 -- accel/accel.sh@12 -- # build_accel_config
00:06:37.070 05:06:08 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:37.070 05:06:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:37.070 05:06:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:37.070 05:06:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:37.070 05:06:08 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:37.070 05:06:08 -- accel/accel.sh@41 -- # local IFS=,
00:06:37.070 05:06:08 -- accel/accel.sh@42 -- # jq -r .
00:06:37.070 [2024-07-23 05:06:08.080516] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:37.070 [2024-07-23 05:06:08.080610] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3127163 ]
00:06:37.070 EAL: No free 2048 kB hugepages reported on node 1
00:06:37.329 [2024-07-23 05:06:08.178136] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:37.329 [2024-07-23 05:06:08.260418] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:37.329 [2024-07-23 05:06:08.302803] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:06:37.329 [2024-07-23 05:06:08.365212] accel_perf.c:1385:main: *ERROR*: ERROR starting application
00:06:37.589 A filename is required.
00:06:37.589 05:06:08 -- common/autotest_common.sh@643 -- # es=234
00:06:37.589 05:06:08 -- common/autotest_common.sh@651 -- # (( es > 128 ))
00:06:37.589 05:06:08 -- common/autotest_common.sh@652 -- # es=106
00:06:37.589 05:06:08 -- common/autotest_common.sh@653 -- # case "$es" in
00:06:37.589 05:06:08 -- common/autotest_common.sh@660 -- # es=1
00:06:37.589 05:06:08 -- common/autotest_common.sh@667 -- # (( !es == 0 ))
00:06:37.589
00:06:37.589 real 0m0.385s
00:06:37.589 user 0m0.269s
00:06:37.589 sys 0m0.157s
00:06:37.589 05:06:08 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:37.589 05:06:08 -- common/autotest_common.sh@10 -- # set +x
00:06:37.589 ************************************
00:06:37.589 END TEST accel_missing_filename
00:06:37.589 ************************************
00:06:37.589 05:06:08 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:37.589 05:06:08 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']'
00:06:37.589 05:06:08 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:37.589 05:06:08 -- common/autotest_common.sh@10 -- # set +x
00:06:37.589 ************************************
00:06:37.589 START TEST accel_compress_verify
00:06:37.589 ************************************
00:06:37.589 05:06:08 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:37.589 05:06:08 -- common/autotest_common.sh@640 -- # local es=0
00:06:37.589 05:06:08 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:37.589 05:06:08 -- common/autotest_common.sh@628 -- # local arg=accel_perf
00:06:37.589 05:06:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in
00:06:37.589 05:06:08 -- common/autotest_common.sh@632 -- # type -t accel_perf
00:06:37.589 05:06:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in
00:06:37.589 05:06:08 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:37.589 05:06:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:06:37.589 05:06:08 -- accel/accel.sh@12 -- # build_accel_config
00:06:37.589 05:06:08 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:37.589 05:06:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:37.589 05:06:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:37.589 05:06:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:37.589 05:06:08 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:37.589 05:06:08 -- accel/accel.sh@41 -- # local IFS=,
00:06:37.589 05:06:08 -- accel/accel.sh@42 -- # jq -r .
00:06:37.589 [2024-07-23 05:06:08.513600] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:37.589 [2024-07-23 05:06:08.513687] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3127182 ]
00:06:37.589 EAL: No free 2048 kB hugepages reported on node 1
00:06:37.589 [2024-07-23 05:06:08.611482] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:37.848 [2024-07-23 05:06:08.694591] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:37.848 [2024-07-23 05:06:08.737000] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:06:37.848 [2024-07-23 05:06:08.799371] accel_perf.c:1385:main: *ERROR*: ERROR starting application
00:06:37.848
00:06:37.848 Compression does not support the verify option, aborting.
00:06:37.848 05:06:08 -- common/autotest_common.sh@643 -- # es=161
00:06:37.848 05:06:08 -- common/autotest_common.sh@651 -- # (( es > 128 ))
00:06:37.848 05:06:08 -- common/autotest_common.sh@652 -- # es=33
00:06:37.848 05:06:08 -- common/autotest_common.sh@653 -- # case "$es" in
00:06:37.848 05:06:08 -- common/autotest_common.sh@660 -- # es=1
00:06:37.848 05:06:08 -- common/autotest_common.sh@667 -- # (( !es == 0 ))
00:06:37.848
00:06:37.848 real 0m0.386s
00:06:37.848 user 0m0.275s
00:06:37.848 sys 0m0.150s
00:06:37.848 05:06:08 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:37.848 05:06:08 -- common/autotest_common.sh@10 -- # set +x
00:06:37.848 ************************************
00:06:37.848 END TEST accel_compress_verify
00:06:37.848 ************************************
00:06:37.848 05:06:08 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar
00:06:37.848 05:06:08 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']'
00:06:37.848 05:06:08 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:37.848 05:06:08 -- common/autotest_common.sh@10 -- # set +x
00:06:37.848 ************************************
00:06:37.848 START TEST accel_wrong_workload
00:06:37.848 ************************************
00:06:37.848 05:06:08 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar
00:06:37.848 05:06:08 -- common/autotest_common.sh@640 -- # local es=0
00:06:37.848 05:06:08 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar
00:06:37.848 05:06:08 -- common/autotest_common.sh@628 -- # local arg=accel_perf
00:06:37.848 05:06:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in
00:06:37.848 05:06:08 -- common/autotest_common.sh@632 -- # type -t accel_perf
00:06:37.848 05:06:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in
00:06:37.848 05:06:08 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar
00:06:37.848 05:06:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar
00:06:37.848 05:06:08 -- accel/accel.sh@12 -- # build_accel_config
00:06:37.848 05:06:08 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:37.848 05:06:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:37.848 05:06:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:37.848 05:06:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:37.848 05:06:08 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:37.848 05:06:08 -- accel/accel.sh@41 -- # local IFS=,
00:06:37.848 05:06:08 -- accel/accel.sh@42 -- # jq -r .
00:06:38.118 Unsupported workload type: foobar
00:06:38.118 [2024-07-23 05:06:08.944047] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1
00:06:38.118 accel_perf options:
00:06:38.118 [-h help message]
00:06:38.118 [-q queue depth per core]
00:06:38.118 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:06:38.118 [-T number of threads per core
00:06:38.118 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:06:38.118 [-t time in seconds]
00:06:38.118 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:06:38.118 [ dif_verify, , dif_generate, dif_generate_copy
00:06:38.118 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:06:38.118 [-l for compress/decompress workloads, name of uncompressed input file
00:06:38.118 [-S for crc32c workload, use this seed value (default 0)
00:06:38.118 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:06:38.118 [-f for fill workload, use this BYTE value (default 255)
00:06:38.118 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:06:38.118 [-y verify result if this switch is on]
00:06:38.118 [-a tasks to allocate per core (default: same value as -q)]
00:06:38.118 Can be used to spread operations across a wider range of memory.
00:06:38.118 05:06:08 -- common/autotest_common.sh@643 -- # es=1
00:06:38.118 05:06:08 -- common/autotest_common.sh@651 -- # (( es > 128 ))
00:06:38.118 05:06:08 -- common/autotest_common.sh@662 -- # [[ -n '' ]]
00:06:38.118 05:06:08 -- common/autotest_common.sh@667 -- # (( !es == 0 ))
00:06:38.118
00:06:38.118 real 0m0.028s
00:06:38.118 user 0m0.008s
00:06:38.118 sys 0m0.020s
00:06:38.118 05:06:08 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:38.118 05:06:08 -- common/autotest_common.sh@10 -- # set +x
00:06:38.118 ************************************
00:06:38.118 END TEST accel_wrong_workload
00:06:38.118 ************************************
00:06:38.118 Error: writing output failed: Broken pipe
00:06:38.118 05:06:08 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1
00:06:38.118 05:06:08 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']'
00:06:38.118 05:06:08 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:38.118 05:06:08 -- common/autotest_common.sh@10 -- # set +x
00:06:38.118 ************************************
00:06:38.118 START TEST accel_negative_buffers
00:06:38.118 ************************************
00:06:38.118 05:06:08 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1
00:06:38.118 05:06:08 -- common/autotest_common.sh@640 -- # local es=0
00:06:38.118 05:06:08 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1
00:06:38.118 05:06:08 -- common/autotest_common.sh@628 -- # local arg=accel_perf
00:06:38.118 05:06:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in
00:06:38.118 05:06:08 -- common/autotest_common.sh@632 -- # type -t accel_perf
00:06:38.118 05:06:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in
00:06:38.118 05:06:09 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1
00:06:38.118 05:06:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1
00:06:38.118 05:06:09 -- accel/accel.sh@12 -- # build_accel_config
00:06:38.118 05:06:09 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:38.118 05:06:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:38.118 05:06:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:38.118 05:06:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:38.118 05:06:09 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:38.118 05:06:09 -- accel/accel.sh@41 -- # local IFS=,
00:06:38.118 05:06:09 -- accel/accel.sh@42 -- # jq -r .
00:06:38.118 -x option must be non-negative.
00:06:38.118 [2024-07-23 05:06:09.020967] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1
00:06:38.118 accel_perf options:
00:06:38.118 [-h help message]
00:06:38.118 [-q queue depth per core]
00:06:38.118 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:06:38.118 [-T number of threads per core
00:06:38.118 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:06:38.118 [-t time in seconds]
00:06:38.118 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:06:38.118 [ dif_verify, , dif_generate, dif_generate_copy
00:06:38.118 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:06:38.118 [-l for compress/decompress workloads, name of uncompressed input file
00:06:38.118 [-S for crc32c workload, use this seed value (default 0)
00:06:38.118 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:06:38.118 [-f for fill workload, use this BYTE value (default 255)
00:06:38.118 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:06:38.118 [-y verify result if this switch is on]
00:06:38.118 [-a tasks to allocate per core (default: same value as -q)]
00:06:38.118 Can be used to spread operations across a wider range of memory.
00:06:38.118 05:06:09 -- common/autotest_common.sh@643 -- # es=1
00:06:38.118 05:06:09 -- common/autotest_common.sh@651 -- # (( es > 128 ))
00:06:38.118 05:06:09 -- common/autotest_common.sh@662 -- # [[ -n '' ]]
00:06:38.118 05:06:09 -- common/autotest_common.sh@667 -- # (( !es == 0 ))
00:06:38.118
00:06:38.118 real 0m0.029s
00:06:38.118 user 0m0.014s
00:06:38.118 sys 0m0.015s
00:06:38.118 05:06:09 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:38.118 05:06:09 -- common/autotest_common.sh@10 -- # set +x
00:06:38.118 ************************************
00:06:38.118 END TEST accel_negative_buffers
00:06:38.118 ************************************
00:06:38.118 Error: writing output failed: Broken pipe
00:06:38.118 05:06:09 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y
00:06:38.118 05:06:09 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']'
00:06:38.118 05:06:09 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:38.118 05:06:09 -- common/autotest_common.sh@10 -- # set +x
00:06:38.118 ************************************
00:06:38.118 START TEST accel_crc32c
00:06:38.118 ************************************
00:06:38.118 05:06:09 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y
00:06:38.118 05:06:09 -- accel/accel.sh@16 -- # local accel_opc
00:06:38.118 05:06:09 -- accel/accel.sh@17 -- # local accel_module
00:06:38.118 05:06:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y
00:06:38.118 05:06:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y
00:06:38.118 05:06:09 -- accel/accel.sh@12 -- # build_accel_config
00:06:38.118 05:06:09 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:38.119 05:06:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:38.119 05:06:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:38.119 05:06:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:38.119 05:06:09 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:38.119 05:06:09 -- accel/accel.sh@41 -- # local IFS=,
00:06:38.119 05:06:09 -- accel/accel.sh@42 -- # jq -r .
00:06:38.119 [2024-07-23 05:06:09.094693] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:38.119 [2024-07-23 05:06:09.094778] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3127353 ]
00:06:38.119 EAL: No free 2048 kB hugepages reported on node 1
00:06:38.119 [2024-07-23 05:06:09.194125] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:38.408 [2024-07-23 05:06:09.282573] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:39.788 05:06:10 -- accel/accel.sh@18 -- # out='
00:06:39.788 SPDK Configuration:
00:06:39.788 Core mask: 0x1
00:06:39.788
00:06:39.788 Accel Perf Configuration:
00:06:39.788 Workload Type: crc32c
00:06:39.788 CRC-32C seed: 32
00:06:39.789 Transfer size: 4096 bytes
00:06:39.789 Vector count 1
00:06:39.789 Module: software
00:06:39.789 Queue depth: 32
00:06:39.789 Allocate depth: 32
00:06:39.789 # threads/core: 1
00:06:39.789 Run time: 1 seconds
00:06:39.789 Verify: Yes
00:06:39.789
00:06:39.789 Running for 1 seconds...
00:06:39.789
00:06:39.789 Core,Thread Transfers Bandwidth Failed Miscompares
00:06:39.789 ------------------------------------------------------------------------------------
00:06:39.789 0,0 572672/s 2237 MiB/s 0 0
00:06:39.789 ====================================================================================
00:06:39.789 Total 572672/s 2237 MiB/s 0 0'
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y
00:06:39.789 05:06:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y
00:06:39.789 05:06:10 -- accel/accel.sh@12 -- # build_accel_config
00:06:39.789 05:06:10 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:39.789 05:06:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:39.789 05:06:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:39.789 05:06:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:39.789 05:06:10 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:39.789 05:06:10 -- accel/accel.sh@41 -- # local IFS=,
00:06:39.789 05:06:10 -- accel/accel.sh@42 -- # jq -r .
00:06:39.789 [2024-07-23 05:06:10.485291] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:39.789 [2024-07-23 05:06:10.485381] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3127543 ]
00:06:39.789 EAL: No free 2048 kB hugepages reported on node 1
00:06:39.789 [2024-07-23 05:06:10.584035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:39.789 [2024-07-23 05:06:10.667105] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=0x1
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=crc32c
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@24 -- # accel_opc=crc32c
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=32
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val='4096 bytes'
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=software
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@23 -- # accel_module=software
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=32
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=32
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=1
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val='1 seconds'
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=Yes
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:39.789 05:06:10 -- accel/accel.sh@21 -- # val=
00:06:39.789 05:06:10 -- accel/accel.sh@22 -- # case "$var" in
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # IFS=:
00:06:39.789 05:06:10 -- accel/accel.sh@20 -- # read -r var val
00:06:41.169 05:06:11 -- accel/accel.sh@21 -- # val=
00:06:41.169 05:06:11 -- accel/accel.sh@22 -- # case "$var" in
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # IFS=:
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # read -r var val
00:06:41.169 05:06:11 -- accel/accel.sh@21 -- # val=
00:06:41.169 05:06:11 -- accel/accel.sh@22 -- # case "$var" in
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # IFS=:
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # read -r var val
00:06:41.169 05:06:11 -- accel/accel.sh@21 -- # val=
00:06:41.169 05:06:11 -- accel/accel.sh@22 -- # case "$var" in
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # IFS=:
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # read -r var val
00:06:41.169 05:06:11 -- accel/accel.sh@21 -- # val=
00:06:41.169 05:06:11 -- accel/accel.sh@22 -- # case "$var" in
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # IFS=:
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # read -r var val
00:06:41.169 05:06:11 -- accel/accel.sh@21 -- # val=
00:06:41.169 05:06:11 -- accel/accel.sh@22 -- # case "$var" in
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # IFS=:
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # read -r var val
00:06:41.169 05:06:11 -- accel/accel.sh@21 -- # val=
00:06:41.169 05:06:11 -- accel/accel.sh@22 -- # case "$var" in
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # IFS=:
00:06:41.169 05:06:11 -- accel/accel.sh@20 -- # read -r var val
00:06:41.169 05:06:11 -- accel/accel.sh@28 -- # [[ -n software ]]
00:06:41.169 05:06:11 -- accel/accel.sh@28 -- # [[ -n crc32c ]]
00:06:41.169 05:06:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:41.169
00:06:41.169 real 0m2.782s
00:06:41.169 user 0m2.494s
00:06:41.169 sys 0m0.294s
00:06:41.169 05:06:11 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:41.169 05:06:11 -- common/autotest_common.sh@10 -- # set +x
00:06:41.169 ************************************
00:06:41.169 END TEST accel_crc32c
00:06:41.169 ************************************
00:06:41.169 05:06:11 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2
00:06:41.169 05:06:11 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']'
00:06:41.169 05:06:11 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:06:41.169 05:06:11 -- common/autotest_common.sh@10 -- # set +x
00:06:41.169 ************************************
00:06:41.169 START TEST accel_crc32c_C2
00:06:41.169 ************************************
00:06:41.169 05:06:11 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2
00:06:41.169 05:06:11 -- accel/accel.sh@16 -- # local accel_opc
00:06:41.169 05:06:11 -- accel/accel.sh@17 -- # local accel_module
00:06:41.169 05:06:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2
00:06:41.169 05:06:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2
00:06:41.169 05:06:11 -- accel/accel.sh@12 -- # build_accel_config
00:06:41.169 05:06:11 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:41.169 05:06:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:41.169 05:06:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:41.169 05:06:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:41.169 05:06:11 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:41.169 05:06:11 -- accel/accel.sh@41 -- # local IFS=,
00:06:41.169 05:06:11 -- accel/accel.sh@42 -- # jq -r .
00:06:41.169 [2024-07-23 05:06:11.926819] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:41.169 [2024-07-23 05:06:11.926917] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3127807 ]
00:06:41.169 EAL: No free 2048 kB hugepages reported on node 1
00:06:41.169 [2024-07-23 05:06:12.026877] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:41.169 [2024-07-23 05:06:12.111023] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:42.547 05:06:13 -- accel/accel.sh@18 -- # out='
00:06:42.547 SPDK Configuration:
00:06:42.547 Core mask: 0x1
00:06:42.547
00:06:42.547 Accel Perf Configuration:
00:06:42.547 Workload Type: crc32c
00:06:42.547 CRC-32C seed: 0
00:06:42.547 Transfer size: 4096 bytes
00:06:42.547 Vector count 2
00:06:42.547 Module: software
00:06:42.547 Queue depth: 32
00:06:42.547 Allocate depth: 32
00:06:42.547 # threads/core: 1
00:06:42.547 Run time: 1 seconds
00:06:42.547 Verify: Yes
00:06:42.547
00:06:42.547 Running for 1 seconds...
00:06:42.547
00:06:42.547 Core,Thread Transfers Bandwidth Failed Miscompares
00:06:42.547 ------------------------------------------------------------------------------------
00:06:42.547 0,0 417088/s 3258 MiB/s 0 0
00:06:42.547 ====================================================================================
00:06:42.547 Total 417088/s 1629 MiB/s 0 0'
00:06:42.547 05:06:13 -- accel/accel.sh@20 -- # IFS=:
00:06:42.547 05:06:13 -- accel/accel.sh@20 -- # read -r var val
00:06:42.547 05:06:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2
00:06:42.547 05:06:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2
00:06:42.547 05:06:13 -- accel/accel.sh@12 -- # build_accel_config
00:06:42.547 05:06:13 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:42.547 05:06:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:42.547 05:06:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:42.547 05:06:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:42.547 05:06:13 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:42.547 05:06:13 -- accel/accel.sh@41 -- # local IFS=,
00:06:42.547 05:06:13 -- accel/accel.sh@42 -- # jq -r .
00:06:42.547 [2024-07-23 05:06:13.314897] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:06:42.547 [2024-07-23 05:06:13.314995] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3128073 ] 00:06:42.547 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.547 [2024-07-23 05:06:13.414432] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.547 [2024-07-23 05:06:13.496142] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.547 05:06:13 -- accel/accel.sh@21 -- # val= 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val= 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val=0x1 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val= 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val= 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val=crc32c 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val=0 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val= 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val=software 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@23 -- # accel_module=software 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val=32 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val=32 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- 
accel/accel.sh@21 -- # val=1 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val=Yes 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val= 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:42.548 05:06:13 -- accel/accel.sh@21 -- # val= 00:06:42.548 05:06:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # IFS=: 00:06:42.548 05:06:13 -- accel/accel.sh@20 -- # read -r var val 00:06:43.926 05:06:14 -- accel/accel.sh@21 -- # val= 00:06:43.926 05:06:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # IFS=: 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # read -r var val 00:06:43.926 05:06:14 -- accel/accel.sh@21 -- # val= 00:06:43.926 05:06:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # IFS=: 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # read -r var val 00:06:43.926 05:06:14 -- accel/accel.sh@21 -- # val= 00:06:43.926 05:06:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # IFS=: 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # read -r var val 00:06:43.926 05:06:14 -- accel/accel.sh@21 -- # val= 00:06:43.926 05:06:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # IFS=: 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # read -r var val 00:06:43.926 05:06:14 -- accel/accel.sh@21 -- # val= 00:06:43.926 05:06:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # IFS=: 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # read -r var val 00:06:43.926 05:06:14 -- accel/accel.sh@21 -- # val= 00:06:43.926 05:06:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # IFS=: 00:06:43.926 05:06:14 -- accel/accel.sh@20 -- # read -r var val 00:06:43.926 05:06:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:43.926 05:06:14 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:43.926 05:06:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.926 00:06:43.926 real 0m2.781s 00:06:43.926 user 0m2.470s 00:06:43.926 sys 0m0.317s 00:06:43.926 05:06:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.926 05:06:14 -- common/autotest_common.sh@10 -- # set +x 00:06:43.926 ************************************ 00:06:43.926 END TEST accel_crc32c_C2 00:06:43.926 ************************************ 00:06:43.926 05:06:14 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:43.926 05:06:14 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:43.926 05:06:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:43.926 05:06:14 -- common/autotest_common.sh@10 -- # set +x 00:06:43.926 ************************************ 00:06:43.926 START TEST accel_copy 
00:06:43.926 ************************************ 00:06:43.926 05:06:14 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:06:43.926 05:06:14 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.926 05:06:14 -- accel/accel.sh@17 -- # local accel_module 00:06:43.926 05:06:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:43.926 05:06:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:43.926 05:06:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.926 05:06:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.926 05:06:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.926 05:06:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.926 05:06:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.926 05:06:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.926 05:06:14 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.926 05:06:14 -- accel/accel.sh@42 -- # jq -r . 00:06:43.927 [2024-07-23 05:06:14.756046] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:43.927 [2024-07-23 05:06:14.756127] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3128364 ] 00:06:43.927 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.927 [2024-07-23 05:06:14.853926] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.927 [2024-07-23 05:06:14.936916] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.305 05:06:16 -- accel/accel.sh@18 -- # out=' 00:06:45.305 SPDK Configuration: 00:06:45.305 Core mask: 0x1 00:06:45.305 00:06:45.305 Accel Perf Configuration: 00:06:45.305 Workload Type: copy 00:06:45.305 Transfer size: 4096 bytes 00:06:45.305 Vector count 1 00:06:45.305 Module: software 00:06:45.305 Queue depth: 32 00:06:45.305 Allocate depth: 32 00:06:45.305 # threads/core: 1 00:06:45.305 Run time: 1 seconds 00:06:45.305 Verify: Yes 00:06:45.305 00:06:45.305 Running for 1 seconds... 00:06:45.305 00:06:45.305 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:45.305 ------------------------------------------------------------------------------------ 00:06:45.305 0,0 370464/s 1447 MiB/s 0 0 00:06:45.305 ==================================================================================== 00:06:45.305 Total 370464/s 1447 MiB/s 0 0' 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:45.305 05:06:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:45.305 05:06:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.305 05:06:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.305 05:06:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.305 05:06:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.305 05:06:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.305 05:06:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.305 05:06:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.305 05:06:16 -- accel/accel.sh@42 -- # jq -r . 00:06:45.305 [2024-07-23 05:06:16.142424] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
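Each run in this block is the accel_perf example binary driven by accel.sh, and the full command line is visible in the xtrace. Repeating the copy case by hand uses the same flags as logged (the binary path below belongs to this workspace and will differ elsewhere):

    PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
    "$PERF" -t 1 -w copy -y   # -t run time in seconds, -w workload type, -y verify results

accel.sh additionally passes -c /dev/fd/62 to hand the binary a JSON accel config over a file descriptor; leaving -c off falls back to the default software module, which is consistent with the Module: software line these runs print.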
00:06:45.305 [2024-07-23 05:06:16.142594] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3128633 ] 00:06:45.305 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.305 [2024-07-23 05:06:16.238448] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.305 [2024-07-23 05:06:16.319840] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val= 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val= 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val=0x1 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val= 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val= 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val=copy 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val= 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val=software 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@23 -- # accel_module=software 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val=32 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val=32 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val=1 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val=Yes 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.305 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.305 05:06:16 -- accel/accel.sh@21 -- # val= 00:06:45.305 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.306 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.306 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:45.306 05:06:16 -- accel/accel.sh@21 -- # val= 00:06:45.306 05:06:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.306 05:06:16 -- accel/accel.sh@20 -- # IFS=: 00:06:45.306 05:06:16 -- accel/accel.sh@20 -- # read -r var val 00:06:46.682 05:06:17 -- accel/accel.sh@21 -- # val= 00:06:46.682 05:06:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # IFS=: 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # read -r var val 00:06:46.682 05:06:17 -- accel/accel.sh@21 -- # val= 00:06:46.682 05:06:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # IFS=: 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # read -r var val 00:06:46.682 05:06:17 -- accel/accel.sh@21 -- # val= 00:06:46.682 05:06:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # IFS=: 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # read -r var val 00:06:46.682 05:06:17 -- accel/accel.sh@21 -- # val= 00:06:46.682 05:06:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # IFS=: 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # read -r var val 00:06:46.682 05:06:17 -- accel/accel.sh@21 -- # val= 00:06:46.682 05:06:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # IFS=: 00:06:46.682 05:06:17 -- accel/accel.sh@20 -- # read -r var val 00:06:46.682 05:06:17 -- accel/accel.sh@21 -- # val= 00:06:46.683 05:06:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.683 05:06:17 -- accel/accel.sh@20 -- # IFS=: 00:06:46.683 05:06:17 -- accel/accel.sh@20 -- # read -r var val 00:06:46.683 05:06:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:46.683 05:06:17 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:46.683 05:06:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.683 00:06:46.683 real 0m2.774s 00:06:46.683 user 0m2.460s 00:06:46.683 sys 0m0.319s 00:06:46.683 05:06:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.683 05:06:17 -- common/autotest_common.sh@10 -- # set +x 00:06:46.683 ************************************ 00:06:46.683 END TEST accel_copy 00:06:46.683 ************************************ 00:06:46.683 05:06:17 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.683 05:06:17 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:46.683 05:06:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:46.683 05:06:17 -- common/autotest_common.sh@10 -- # set +x 00:06:46.683 ************************************ 00:06:46.683 START TEST accel_fill 00:06:46.683 ************************************ 00:06:46.683 05:06:17 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.683 05:06:17 -- accel/accel.sh@16 -- # local accel_opc 
00:06:46.683 05:06:17 -- accel/accel.sh@17 -- # local accel_module 00:06:46.683 05:06:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.683 05:06:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.683 05:06:17 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.683 05:06:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.683 05:06:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.683 05:06:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.683 05:06:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.683 05:06:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.683 05:06:17 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.683 05:06:17 -- accel/accel.sh@42 -- # jq -r . 00:06:46.683 [2024-07-23 05:06:17.578135] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:46.683 [2024-07-23 05:06:17.578215] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3128916 ] 00:06:46.683 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.683 [2024-07-23 05:06:17.674772] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.683 [2024-07-23 05:06:17.757355] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.063 05:06:18 -- accel/accel.sh@18 -- # out=' 00:06:48.063 SPDK Configuration: 00:06:48.063 Core mask: 0x1 00:06:48.063 00:06:48.063 Accel Perf Configuration: 00:06:48.063 Workload Type: fill 00:06:48.063 Fill pattern: 0x80 00:06:48.063 Transfer size: 4096 bytes 00:06:48.063 Vector count 1 00:06:48.063 Module: software 00:06:48.063 Queue depth: 64 00:06:48.063 Allocate depth: 64 00:06:48.063 # threads/core: 1 00:06:48.063 Run time: 1 seconds 00:06:48.063 Verify: Yes 00:06:48.063 00:06:48.063 Running for 1 seconds... 00:06:48.063 00:06:48.063 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:48.063 ------------------------------------------------------------------------------------ 00:06:48.063 0,0 660224/s 2579 MiB/s 0 0 00:06:48.063 ==================================================================================== 00:06:48.063 Total 660224/s 2579 MiB/s 0 0' 00:06:48.063 05:06:18 -- accel/accel.sh@20 -- # IFS=: 00:06:48.063 05:06:18 -- accel/accel.sh@20 -- # read -r var val 00:06:48.063 05:06:18 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:48.064 05:06:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:48.064 05:06:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.064 05:06:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.064 05:06:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.064 05:06:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.064 05:06:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.064 05:06:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.064 05:06:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.064 05:06:18 -- accel/accel.sh@42 -- # jq -r . 00:06:48.064 [2024-07-23 05:06:18.960067] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
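The fill flags map one-to-one onto the configuration block that accel_perf echoes back: -f 128 is the fill byte (printed in hex as 0x80), -q 64 the queue depth, and -a 64 the allocate depth. Two quick checks of the echoed values (plain shell, not SPDK code):

    printf '0x%x\n' 128                            # 0x80, the Fill pattern shown above
    echo "$(( 660224 * 4096 / 1048576 )) MiB/s"    # 2579 MiB/s, matching the fill table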
00:06:48.064 [2024-07-23 05:06:18.960161] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3129188 ] 00:06:48.064 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.064 [2024-07-23 05:06:19.058762] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.064 [2024-07-23 05:06:19.140173] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.323 05:06:19 -- accel/accel.sh@21 -- # val= 00:06:48.323 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.323 05:06:19 -- accel/accel.sh@21 -- # val= 00:06:48.323 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.323 05:06:19 -- accel/accel.sh@21 -- # val=0x1 00:06:48.323 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.323 05:06:19 -- accel/accel.sh@21 -- # val= 00:06:48.323 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.323 05:06:19 -- accel/accel.sh@21 -- # val= 00:06:48.323 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.323 05:06:19 -- accel/accel.sh@21 -- # val=fill 00:06:48.323 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.323 05:06:19 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.323 05:06:19 -- accel/accel.sh@21 -- # val=0x80 00:06:48.323 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.323 05:06:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:48.323 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.323 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.323 05:06:19 -- accel/accel.sh@21 -- # val= 00:06:48.323 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.324 05:06:19 -- accel/accel.sh@21 -- # val=software 00:06:48.324 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.324 05:06:19 -- accel/accel.sh@23 -- # accel_module=software 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.324 05:06:19 -- accel/accel.sh@21 -- # val=64 00:06:48.324 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.324 05:06:19 -- accel/accel.sh@21 -- # val=64 00:06:48.324 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.324 05:06:19 -- 
accel/accel.sh@21 -- # val=1 00:06:48.324 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.324 05:06:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:48.324 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.324 05:06:19 -- accel/accel.sh@21 -- # val=Yes 00:06:48.324 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.324 05:06:19 -- accel/accel.sh@21 -- # val= 00:06:48.324 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:48.324 05:06:19 -- accel/accel.sh@21 -- # val= 00:06:48.324 05:06:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # IFS=: 00:06:48.324 05:06:19 -- accel/accel.sh@20 -- # read -r var val 00:06:49.262 05:06:20 -- accel/accel.sh@21 -- # val= 00:06:49.262 05:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:49.262 05:06:20 -- accel/accel.sh@21 -- # val= 00:06:49.262 05:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:49.262 05:06:20 -- accel/accel.sh@21 -- # val= 00:06:49.262 05:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:49.262 05:06:20 -- accel/accel.sh@21 -- # val= 00:06:49.262 05:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:49.262 05:06:20 -- accel/accel.sh@21 -- # val= 00:06:49.262 05:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:49.262 05:06:20 -- accel/accel.sh@21 -- # val= 00:06:49.262 05:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:49.262 05:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:49.262 05:06:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:49.262 05:06:20 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:49.262 05:06:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.262 00:06:49.262 real 0m2.773s 00:06:49.262 user 0m2.466s 00:06:49.262 sys 0m0.313s 00:06:49.262 05:06:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:49.262 05:06:20 -- common/autotest_common.sh@10 -- # set +x 00:06:49.262 ************************************ 00:06:49.262 END TEST accel_fill 00:06:49.262 ************************************ 00:06:49.522 05:06:20 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:49.522 05:06:20 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:49.522 05:06:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:49.522 05:06:20 -- common/autotest_common.sh@10 -- # set +x 00:06:49.522 ************************************ 00:06:49.522 START TEST 
accel_copy_crc32c 00:06:49.522 ************************************ 00:06:49.522 05:06:20 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:06:49.522 05:06:20 -- accel/accel.sh@16 -- # local accel_opc 00:06:49.522 05:06:20 -- accel/accel.sh@17 -- # local accel_module 00:06:49.522 05:06:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:49.522 05:06:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:49.522 05:06:20 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.522 05:06:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.522 05:06:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.522 05:06:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.522 05:06:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.522 05:06:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.522 05:06:20 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.522 05:06:20 -- accel/accel.sh@42 -- # jq -r . 00:06:49.522 [2024-07-23 05:06:20.401141] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:49.522 [2024-07-23 05:06:20.401232] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3129471 ] 00:06:49.522 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.522 [2024-07-23 05:06:20.499709] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.522 [2024-07-23 05:06:20.584630] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.902 05:06:21 -- accel/accel.sh@18 -- # out=' 00:06:50.902 SPDK Configuration: 00:06:50.902 Core mask: 0x1 00:06:50.902 00:06:50.902 Accel Perf Configuration: 00:06:50.902 Workload Type: copy_crc32c 00:06:50.902 CRC-32C seed: 0 00:06:50.902 Vector size: 4096 bytes 00:06:50.902 Transfer size: 4096 bytes 00:06:50.902 Vector count 1 00:06:50.902 Module: software 00:06:50.902 Queue depth: 32 00:06:50.902 Allocate depth: 32 00:06:50.902 # threads/core: 1 00:06:50.902 Run time: 1 seconds 00:06:50.902 Verify: Yes 00:06:50.902 00:06:50.902 Running for 1 seconds... 00:06:50.902 00:06:50.902 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:50.902 ------------------------------------------------------------------------------------ 00:06:50.902 0,0 293120/s 1145 MiB/s 0 0 00:06:50.902 ==================================================================================== 00:06:50.902 Total 293120/s 1145 MiB/s 0 0' 00:06:50.902 05:06:21 -- accel/accel.sh@20 -- # IFS=: 00:06:50.902 05:06:21 -- accel/accel.sh@20 -- # read -r var val 00:06:50.902 05:06:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:50.902 05:06:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:50.902 05:06:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.902 05:06:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.902 05:06:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.902 05:06:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.902 05:06:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.902 05:06:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.902 05:06:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.902 05:06:21 -- accel/accel.sh@42 -- # jq -r . 
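The repeating IFS=: / read -r var val / case "$var" triplets in the xtrace are one dispatch loop inside accel.sh: a stream of colon-separated key:value settings is read line by line and applied. A minimal sketch of that pattern, with made-up keys rather than the verbatim script:

    printf '%s\n' 'opc:copy_crc32c' 'qd:32' 'seconds:1' |
    while IFS=: read -r var val; do
        case "$var" in
            opc) echo "accel_opc=$val" ;;    # cf. the accel_opc=... lines in the trace
            qd)  echo "queue_depth=$val" ;;
            *)   echo "skip $var" ;;         # keys this sketch does not model
        esac
    done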
00:06:50.902 [2024-07-23 05:06:21.789424] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:50.902 [2024-07-23 05:06:21.789523] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3129742 ] 00:06:50.902 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.902 [2024-07-23 05:06:21.886774] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.902 [2024-07-23 05:06:21.968164] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val= 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val= 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val=0x1 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val= 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val= 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val=0 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val= 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val=software 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@23 -- # accel_module=software 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val=32 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 
00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val=32 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val=1 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val=Yes 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val= 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:51.162 05:06:22 -- accel/accel.sh@21 -- # val= 00:06:51.162 05:06:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # IFS=: 00:06:51.162 05:06:22 -- accel/accel.sh@20 -- # read -r var val 00:06:52.101 05:06:23 -- accel/accel.sh@21 -- # val= 00:06:52.101 05:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:52.101 05:06:23 -- accel/accel.sh@21 -- # val= 00:06:52.101 05:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:52.101 05:06:23 -- accel/accel.sh@21 -- # val= 00:06:52.101 05:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:52.101 05:06:23 -- accel/accel.sh@21 -- # val= 00:06:52.101 05:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:52.101 05:06:23 -- accel/accel.sh@21 -- # val= 00:06:52.101 05:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:52.101 05:06:23 -- accel/accel.sh@21 -- # val= 00:06:52.101 05:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:52.101 05:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:52.101 05:06:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:52.101 05:06:23 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:52.101 05:06:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:52.101 00:06:52.101 real 0m2.779s 00:06:52.101 user 0m2.468s 00:06:52.101 sys 0m0.318s 00:06:52.101 05:06:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:52.101 05:06:23 -- common/autotest_common.sh@10 -- # set +x 00:06:52.101 ************************************ 00:06:52.101 END TEST accel_copy_crc32c 00:06:52.101 ************************************ 00:06:52.367 
05:06:23 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:52.367 05:06:23 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:52.367 05:06:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:52.367 05:06:23 -- common/autotest_common.sh@10 -- # set +x 00:06:52.367 ************************************ 00:06:52.367 START TEST accel_copy_crc32c_C2 00:06:52.367 ************************************ 00:06:52.367 05:06:23 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:52.367 05:06:23 -- accel/accel.sh@16 -- # local accel_opc 00:06:52.367 05:06:23 -- accel/accel.sh@17 -- # local accel_module 00:06:52.367 05:06:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:52.367 05:06:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:52.367 05:06:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.367 05:06:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.367 05:06:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.367 05:06:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.367 05:06:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.367 05:06:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.367 05:06:23 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.367 05:06:23 -- accel/accel.sh@42 -- # jq -r . 00:06:52.367 [2024-07-23 05:06:23.227623] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:52.367 [2024-07-23 05:06:23.227716] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3130031 ] 00:06:52.367 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.367 [2024-07-23 05:06:23.326993] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.367 [2024-07-23 05:06:23.409657] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.753 05:06:24 -- accel/accel.sh@18 -- # out=' 00:06:53.753 SPDK Configuration: 00:06:53.753 Core mask: 0x1 00:06:53.753 00:06:53.753 Accel Perf Configuration: 00:06:53.753 Workload Type: copy_crc32c 00:06:53.753 CRC-32C seed: 0 00:06:53.753 Vector size: 4096 bytes 00:06:53.753 Transfer size: 8192 bytes 00:06:53.753 Vector count 2 00:06:53.753 Module: software 00:06:53.753 Queue depth: 32 00:06:53.753 Allocate depth: 32 00:06:53.753 # threads/core: 1 00:06:53.753 Run time: 1 seconds 00:06:53.753 Verify: Yes 00:06:53.753 00:06:53.753 Running for 1 seconds... 
00:06:53.753 00:06:53.753 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:53.753 ------------------------------------------------------------------------------------ 00:06:53.753 0,0 204992/s 1601 MiB/s 0 0 00:06:53.753 ==================================================================================== 00:06:53.753 Total 204992/s 800 MiB/s 0 0' 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.753 05:06:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:53.753 05:06:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:53.753 05:06:24 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.753 05:06:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.753 05:06:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.753 05:06:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.753 05:06:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.753 05:06:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.753 05:06:24 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.753 05:06:24 -- accel/accel.sh@42 -- # jq -r . 00:06:53.753 [2024-07-23 05:06:24.613008] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:53.753 [2024-07-23 05:06:24.613099] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3130219 ] 00:06:53.753 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.753 [2024-07-23 05:06:24.710742] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.753 [2024-07-23 05:06:24.792169] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.753 05:06:24 -- accel/accel.sh@21 -- # val= 00:06:53.753 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.753 05:06:24 -- accel/accel.sh@21 -- # val= 00:06:53.753 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.753 05:06:24 -- accel/accel.sh@21 -- # val=0x1 00:06:53.753 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.753 05:06:24 -- accel/accel.sh@21 -- # val= 00:06:53.753 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.753 05:06:24 -- accel/accel.sh@21 -- # val= 00:06:53.753 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.753 05:06:24 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:53.753 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.753 05:06:24 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:53.753 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.754 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.754 05:06:24 -- accel/accel.sh@21 -- # val=0 00:06:53.754 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.754 05:06:24 -- accel/accel.sh@20 -- # IFS=: 
00:06:53.754 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.754 05:06:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:53.754 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.754 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.754 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.754 05:06:24 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:53.754 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.754 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.754 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.754 05:06:24 -- accel/accel.sh@21 -- # val= 00:06:53.754 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.754 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:53.754 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:53.754 05:06:24 -- accel/accel.sh@21 -- # val=software 00:06:54.012 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.012 05:06:24 -- accel/accel.sh@23 -- # accel_module=software 00:06:54.012 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.012 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.012 05:06:24 -- accel/accel.sh@21 -- # val=32 00:06:54.012 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.012 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.012 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 05:06:24 -- accel/accel.sh@21 -- # val=32 00:06:54.013 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 05:06:24 -- accel/accel.sh@21 -- # val=1 00:06:54.013 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 05:06:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:54.013 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 05:06:24 -- accel/accel.sh@21 -- # val=Yes 00:06:54.013 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 05:06:24 -- accel/accel.sh@21 -- # val= 00:06:54.013 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 05:06:24 -- accel/accel.sh@21 -- # val= 00:06:54.013 05:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 05:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:54.950 05:06:25 -- accel/accel.sh@21 -- # val= 00:06:54.950 05:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.950 05:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:54.950 05:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:54.950 05:06:25 -- accel/accel.sh@21 -- # val= 00:06:54.950 05:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.950 05:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:54.950 05:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:54.950 05:06:25 -- accel/accel.sh@21 -- # val= 00:06:54.950 05:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.950 05:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:54.951 05:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:54.951 05:06:25 -- accel/accel.sh@21 -- # val= 00:06:54.951 05:06:25 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:54.951 05:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:54.951 05:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:54.951 05:06:25 -- accel/accel.sh@21 -- # val= 00:06:54.951 05:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.951 05:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:54.951 05:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:54.951 05:06:25 -- accel/accel.sh@21 -- # val= 00:06:54.951 05:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.951 05:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:54.951 05:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:54.951 05:06:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:54.951 05:06:25 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:54.951 05:06:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:54.951 00:06:54.951 real 0m2.770s 00:06:54.951 user 0m2.463s 00:06:54.951 sys 0m0.315s 00:06:54.951 05:06:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:54.951 05:06:25 -- common/autotest_common.sh@10 -- # set +x 00:06:54.951 ************************************ 00:06:54.951 END TEST accel_copy_crc32c_C2 00:06:54.951 ************************************ 00:06:54.951 05:06:26 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:54.951 05:06:26 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:54.951 05:06:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:54.951 05:06:26 -- common/autotest_common.sh@10 -- # set +x 00:06:54.951 ************************************ 00:06:54.951 START TEST accel_dualcast 00:06:54.951 ************************************ 00:06:54.951 05:06:26 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:06:54.951 05:06:26 -- accel/accel.sh@16 -- # local accel_opc 00:06:54.951 05:06:26 -- accel/accel.sh@17 -- # local accel_module 00:06:54.951 05:06:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:54.951 05:06:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:54.951 05:06:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.951 05:06:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.951 05:06:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.951 05:06:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.951 05:06:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.951 05:06:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.951 05:06:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.951 05:06:26 -- accel/accel.sh@42 -- # jq -r . 00:06:54.951 [2024-07-23 05:06:26.038209] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
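The dualcast workload starting here writes one source buffer to two destination buffers per descriptor. As a loose userland analogy only, not how the accel framework is implemented, fanning a stream out with tee and comparing the copies plays the same role:

    head -c 4096 /dev/urandom > /tmp/src     # one 4 KiB source, like the logged transfer size
    tee /tmp/dst1 < /tmp/src > /tmp/dst2     # duplicate to two destinations
    cmp /tmp/dst1 /tmp/dst2 && echo 'destinations match'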
00:06:54.951 [2024-07-23 05:06:26.038298] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3130450 ] 00:06:55.210 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.210 [2024-07-23 05:06:26.135815] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.210 [2024-07-23 05:06:26.222032] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.588 05:06:27 -- accel/accel.sh@18 -- # out=' 00:06:56.588 SPDK Configuration: 00:06:56.588 Core mask: 0x1 00:06:56.588 00:06:56.588 Accel Perf Configuration: 00:06:56.588 Workload Type: dualcast 00:06:56.588 Transfer size: 4096 bytes 00:06:56.588 Vector count 1 00:06:56.588 Module: software 00:06:56.588 Queue depth: 32 00:06:56.588 Allocate depth: 32 00:06:56.588 # threads/core: 1 00:06:56.588 Run time: 1 seconds 00:06:56.588 Verify: Yes 00:06:56.588 00:06:56.588 Running for 1 seconds... 00:06:56.588 00:06:56.588 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:56.588 ------------------------------------------------------------------------------------ 00:06:56.588 0,0 446592/s 1744 MiB/s 0 0 00:06:56.588 ==================================================================================== 00:06:56.588 Total 446592/s 1744 MiB/s 0 0' 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.588 05:06:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:56.588 05:06:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:56.588 05:06:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.588 05:06:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.588 05:06:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.588 05:06:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.588 05:06:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.588 05:06:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.588 05:06:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.588 05:06:27 -- accel/accel.sh@42 -- # jq -r . 00:06:56.588 [2024-07-23 05:06:27.424559] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
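The DPDK EAL parameter line repeated before each run decodes as follows: -c 0x1 is the core mask behind "Total cores available: 1", --file-prefix=spdk_pid<N> namespaces the hugepage files per process so back-to-back runs cannot collide, --base-virtaddr pins the mapping base address, and --huge-unlink removes hugepage files once they are mapped. Listing the per-run prefixes from a saved copy of this log (filename hypothetical):

    grep -o 'file-prefix=spdk_pid[0-9]*' short-fuzz-phy-autotest.log | sort -u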
00:06:56.588 [2024-07-23 05:06:27.424654] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3130639 ] 00:06:56.588 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.588 [2024-07-23 05:06:27.522365] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.588 [2024-07-23 05:06:27.605418] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.588 05:06:27 -- accel/accel.sh@21 -- # val= 00:06:56.588 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.588 05:06:27 -- accel/accel.sh@21 -- # val= 00:06:56.588 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.588 05:06:27 -- accel/accel.sh@21 -- # val=0x1 00:06:56.588 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.588 05:06:27 -- accel/accel.sh@21 -- # val= 00:06:56.588 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.588 05:06:27 -- accel/accel.sh@21 -- # val= 00:06:56.588 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.588 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val=dualcast 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val= 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val=software 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@23 -- # accel_module=software 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val=32 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val=32 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val=1 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 
-- accel/accel.sh@21 -- # val='1 seconds' 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val=Yes 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val= 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:56.589 05:06:27 -- accel/accel.sh@21 -- # val= 00:06:56.589 05:06:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # IFS=: 00:06:56.589 05:06:27 -- accel/accel.sh@20 -- # read -r var val 00:06:57.969 05:06:28 -- accel/accel.sh@21 -- # val= 00:06:57.969 05:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:57.969 05:06:28 -- accel/accel.sh@21 -- # val= 00:06:57.969 05:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:57.969 05:06:28 -- accel/accel.sh@21 -- # val= 00:06:57.969 05:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:57.969 05:06:28 -- accel/accel.sh@21 -- # val= 00:06:57.969 05:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:57.969 05:06:28 -- accel/accel.sh@21 -- # val= 00:06:57.969 05:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:57.969 05:06:28 -- accel/accel.sh@21 -- # val= 00:06:57.969 05:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:57.969 05:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:57.969 05:06:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:57.969 05:06:28 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:57.969 05:06:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:57.969 00:06:57.969 real 0m2.771s 00:06:57.969 user 0m2.462s 00:06:57.969 sys 0m0.305s 00:06:57.969 05:06:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:57.969 05:06:28 -- common/autotest_common.sh@10 -- # set +x 00:06:57.969 ************************************ 00:06:57.969 END TEST accel_dualcast 00:06:57.969 ************************************ 00:06:57.969 05:06:28 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:57.969 05:06:28 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:57.969 05:06:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:57.969 05:06:28 -- common/autotest_common.sh@10 -- # set +x 00:06:57.969 ************************************ 00:06:57.969 START TEST accel_compare 00:06:57.969 ************************************ 00:06:57.969 05:06:28 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:06:57.969 05:06:28 -- accel/accel.sh@16 -- # local accel_opc 00:06:57.969 05:06:28 
-- accel/accel.sh@17 -- # local accel_module 00:06:57.969 05:06:28 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:57.969 05:06:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:57.969 05:06:28 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.969 05:06:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:57.969 05:06:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.969 05:06:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.969 05:06:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:57.969 05:06:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:57.969 05:06:28 -- accel/accel.sh@41 -- # local IFS=, 00:06:57.969 05:06:28 -- accel/accel.sh@42 -- # jq -r . 00:06:57.969 [2024-07-23 05:06:28.849523] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:06:57.969 [2024-07-23 05:06:28.849613] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3130898 ] 00:06:57.969 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.969 [2024-07-23 05:06:28.947330] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.969 [2024-07-23 05:06:29.029694] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.349 05:06:30 -- accel/accel.sh@18 -- # out=' 00:06:59.349 SPDK Configuration: 00:06:59.349 Core mask: 0x1 00:06:59.349 00:06:59.349 Accel Perf Configuration: 00:06:59.349 Workload Type: compare 00:06:59.349 Transfer size: 4096 bytes 00:06:59.349 Vector count 1 00:06:59.349 Module: software 00:06:59.349 Queue depth: 32 00:06:59.349 Allocate depth: 32 00:06:59.349 # threads/core: 1 00:06:59.349 Run time: 1 seconds 00:06:59.349 Verify: Yes 00:06:59.349 00:06:59.349 Running for 1 seconds... 00:06:59.349 00:06:59.349 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:59.349 ------------------------------------------------------------------------------------ 00:06:59.349 0,0 548192/s 2141 MiB/s 0 0 00:06:59.349 ==================================================================================== 00:06:59.349 Total 548192/s 2141 MiB/s 0 0' 00:06:59.349 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.349 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.349 05:06:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:59.349 05:06:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:59.349 05:06:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.349 05:06:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.349 05:06:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.349 05:06:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.349 05:06:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.349 05:06:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.349 05:06:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.349 05:06:30 -- accel/accel.sh@42 -- # jq -r . 00:06:59.349 [2024-07-23 05:06:30.234858] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
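The bandwidth column in the compare table above follows directly from the transfer rate: every operation moves one 4096-byte buffer, so MiB/s is transfers/s times the transfer size over 2^20. A minimal sketch of the arithmetic, using the numbers from the Total row:

  # Reproduce "Total 548192/s 2141 MiB/s" from the 4096-byte transfer size.
  transfers_per_sec=548192
  transfer_size=4096   # bytes, from "Transfer size: 4096 bytes"
  echo "$(( transfers_per_sec * transfer_size / 1024 / 1024 )) MiB/s"   # prints "2141 MiB/s"

The same relation holds for every workload table in this run, e.g. 509344/s at 4096 bytes gives the xor test's 1989 MiB/s.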
00:06:59.349 [2024-07-23 05:06:30.234991] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3131170 ] 00:06:59.349 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.349 [2024-07-23 05:06:30.389915] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.608 [2024-07-23 05:06:30.477707] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.608 05:06:30 -- accel/accel.sh@21 -- # val= 00:06:59.608 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.608 05:06:30 -- accel/accel.sh@21 -- # val= 00:06:59.608 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.608 05:06:30 -- accel/accel.sh@21 -- # val=0x1 00:06:59.608 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.608 05:06:30 -- accel/accel.sh@21 -- # val= 00:06:59.608 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.608 05:06:30 -- accel/accel.sh@21 -- # val= 00:06:59.608 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.608 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.608 05:06:30 -- accel/accel.sh@21 -- # val=compare 00:06:59.608 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.608 05:06:30 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- accel/accel.sh@21 -- # val= 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- accel/accel.sh@21 -- # val=software 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@23 -- # accel_module=software 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- accel/accel.sh@21 -- # val=32 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- accel/accel.sh@21 -- # val=32 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- accel/accel.sh@21 -- # val=1 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- accel/accel.sh@21 -- # val=Yes 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- accel/accel.sh@21 -- # val= 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:59.609 05:06:30 -- accel/accel.sh@21 -- # val= 00:06:59.609 05:06:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:59.609 05:06:30 -- accel/accel.sh@20 -- # read -r var val 00:07:00.988 05:06:31 -- accel/accel.sh@21 -- # val= 00:07:00.988 05:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # IFS=: 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # read -r var val 00:07:00.988 05:06:31 -- accel/accel.sh@21 -- # val= 00:07:00.988 05:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # IFS=: 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # read -r var val 00:07:00.988 05:06:31 -- accel/accel.sh@21 -- # val= 00:07:00.988 05:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # IFS=: 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # read -r var val 00:07:00.988 05:06:31 -- accel/accel.sh@21 -- # val= 00:07:00.988 05:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # IFS=: 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # read -r var val 00:07:00.988 05:06:31 -- accel/accel.sh@21 -- # val= 00:07:00.988 05:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # IFS=: 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # read -r var val 00:07:00.988 05:06:31 -- accel/accel.sh@21 -- # val= 00:07:00.988 05:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # IFS=: 00:07:00.988 05:06:31 -- accel/accel.sh@20 -- # read -r var val 00:07:00.988 05:06:31 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:00.988 05:06:31 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:07:00.988 05:06:31 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.988 00:07:00.988 real 0m2.833s 00:07:00.988 user 0m2.475s 00:07:00.988 sys 0m0.353s 00:07:00.988 05:06:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.988 05:06:31 -- common/autotest_common.sh@10 -- # set +x 00:07:00.988 ************************************ 00:07:00.988 END TEST accel_compare 00:07:00.988 ************************************ 00:07:00.989 05:06:31 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:00.989 05:06:31 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:07:00.989 05:06:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:00.989 05:06:31 -- common/autotest_common.sh@10 -- # set +x 00:07:00.989 ************************************ 00:07:00.989 START TEST accel_xor 00:07:00.989 ************************************ 00:07:00.989 05:06:31 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:07:00.989 05:06:31 -- accel/accel.sh@16 -- # local accel_opc 00:07:00.989 05:06:31 -- accel/accel.sh@17 
-- # local accel_module 00:07:00.989 05:06:31 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:07:00.989 05:06:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:00.989 05:06:31 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.989 05:06:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.989 05:06:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.989 05:06:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.989 05:06:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.989 05:06:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.989 05:06:31 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.989 05:06:31 -- accel/accel.sh@42 -- # jq -r . 00:07:00.989 [2024-07-23 05:06:31.716163] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:00.989 [2024-07-23 05:06:31.716253] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3131454 ] 00:07:00.989 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.989 [2024-07-23 05:06:31.814623] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.989 [2024-07-23 05:06:31.897015] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.367 05:06:33 -- accel/accel.sh@18 -- # out=' 00:07:02.367 SPDK Configuration: 00:07:02.367 Core mask: 0x1 00:07:02.367 00:07:02.367 Accel Perf Configuration: 00:07:02.367 Workload Type: xor 00:07:02.367 Source buffers: 2 00:07:02.367 Transfer size: 4096 bytes 00:07:02.367 Vector count 1 00:07:02.367 Module: software 00:07:02.367 Queue depth: 32 00:07:02.367 Allocate depth: 32 00:07:02.367 # threads/core: 1 00:07:02.367 Run time: 1 seconds 00:07:02.367 Verify: Yes 00:07:02.367 00:07:02.367 Running for 1 seconds... 00:07:02.367 00:07:02.367 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:02.367 ------------------------------------------------------------------------------------ 00:07:02.367 0,0 509344/s 1989 MiB/s 0 0 00:07:02.367 ==================================================================================== 00:07:02.367 Total 509344/s 1989 MiB/s 0 0' 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:02.367 05:06:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:02.367 05:06:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.367 05:06:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.367 05:06:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.367 05:06:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.367 05:06:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.367 05:06:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.367 05:06:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.367 05:06:33 -- accel/accel.sh@42 -- # jq -r . 00:07:02.367 [2024-07-23 05:06:33.088310] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
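Each pass above runs the accel_perf example binary with its JSON accel configuration fed over file descriptor 62 (-c /dev/fd/62), assembled by build_accel_config; since none of the [[ 0 -gt 0 ]] module guards fire in this job, no hardware module is selected and the software path is used. A minimal standalone sketch of the same invocation, assuming the workspace path from the log and a placeholder empty config (the exact JSON accel.sh emits is not shown in the trace):

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # Here-string on fd 62 stands in for the config accel.sh pipes to /dev/fd/62.
  "$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w xor -y 62<<< '{}'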
00:07:02.367 [2024-07-23 05:06:33.088364] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3131721 ] 00:07:02.367 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.367 [2024-07-23 05:06:33.171582] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.367 [2024-07-23 05:06:33.253277] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val= 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val= 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val=0x1 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val= 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val= 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val=xor 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val=2 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val= 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val=software 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@23 -- # accel_module=software 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val=32 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val=32 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- 
accel/accel.sh@21 -- # val=1 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val=Yes 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val= 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:02.367 05:06:33 -- accel/accel.sh@21 -- # val= 00:07:02.367 05:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # IFS=: 00:07:02.367 05:06:33 -- accel/accel.sh@20 -- # read -r var val 00:07:03.743 05:06:34 -- accel/accel.sh@21 -- # val= 00:07:03.743 05:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # IFS=: 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # read -r var val 00:07:03.743 05:06:34 -- accel/accel.sh@21 -- # val= 00:07:03.743 05:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # IFS=: 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # read -r var val 00:07:03.743 05:06:34 -- accel/accel.sh@21 -- # val= 00:07:03.743 05:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # IFS=: 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # read -r var val 00:07:03.743 05:06:34 -- accel/accel.sh@21 -- # val= 00:07:03.743 05:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # IFS=: 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # read -r var val 00:07:03.743 05:06:34 -- accel/accel.sh@21 -- # val= 00:07:03.743 05:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # IFS=: 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # read -r var val 00:07:03.743 05:06:34 -- accel/accel.sh@21 -- # val= 00:07:03.743 05:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # IFS=: 00:07:03.743 05:06:34 -- accel/accel.sh@20 -- # read -r var val 00:07:03.743 05:06:34 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:03.743 05:06:34 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:03.743 05:06:34 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:03.743 00:07:03.743 real 0m2.740s 00:07:03.743 user 0m2.445s 00:07:03.743 sys 0m0.292s 00:07:03.743 05:06:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:03.743 05:06:34 -- common/autotest_common.sh@10 -- # set +x 00:07:03.743 ************************************ 00:07:03.743 END TEST accel_xor 00:07:03.743 ************************************ 00:07:03.743 05:06:34 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:03.743 05:06:34 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:03.743 05:06:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:03.743 05:06:34 -- common/autotest_common.sh@10 -- # set +x 00:07:03.743 ************************************ 00:07:03.743 START TEST accel_xor 
00:07:03.743 ************************************ 00:07:03.743 05:06:34 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:07:03.743 05:06:34 -- accel/accel.sh@16 -- # local accel_opc 00:07:03.743 05:06:34 -- accel/accel.sh@17 -- # local accel_module 00:07:03.743 05:06:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:07:03.743 05:06:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:03.743 05:06:34 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.743 05:06:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.743 05:06:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.743 05:06:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.743 05:06:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.743 05:06:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.743 05:06:34 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.743 05:06:34 -- accel/accel.sh@42 -- # jq -r . 00:07:03.743 [2024-07-23 05:06:34.494149] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:03.743 [2024-07-23 05:06:34.494240] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3132013 ] 00:07:03.743 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.743 [2024-07-23 05:06:34.590418] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.743 [2024-07-23 05:06:34.672772] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.121 05:06:35 -- accel/accel.sh@18 -- # out=' 00:07:05.121 SPDK Configuration: 00:07:05.121 Core mask: 0x1 00:07:05.121 00:07:05.121 Accel Perf Configuration: 00:07:05.121 Workload Type: xor 00:07:05.121 Source buffers: 3 00:07:05.121 Transfer size: 4096 bytes 00:07:05.121 Vector count 1 00:07:05.121 Module: software 00:07:05.121 Queue depth: 32 00:07:05.121 Allocate depth: 32 00:07:05.121 # threads/core: 1 00:07:05.121 Run time: 1 seconds 00:07:05.121 Verify: Yes 00:07:05.121 00:07:05.121 Running for 1 seconds... 00:07:05.121 00:07:05.121 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:05.121 ------------------------------------------------------------------------------------ 00:07:05.121 0,0 478240/s 1868 MiB/s 0 0 00:07:05.121 ==================================================================================== 00:07:05.121 Total 478240/s 1868 MiB/s 0 0' 00:07:05.121 05:06:35 -- accel/accel.sh@20 -- # IFS=: 00:07:05.121 05:06:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:05.121 05:06:35 -- accel/accel.sh@20 -- # read -r var val 00:07:05.121 05:06:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:05.121 05:06:35 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.121 05:06:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.121 05:06:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.121 05:06:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.121 05:06:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.121 05:06:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.121 05:06:35 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.121 05:06:35 -- accel/accel.sh@42 -- # jq -r . 00:07:05.121 [2024-07-23 05:06:35.863894] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
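This second xor test exercises the same opcode with three source buffers instead of the default two; the only difference in the driver is the -x 3 flag, which shows up as "Source buffers: 3" in the configuration block and costs roughly 6% of throughput versus the two-buffer run (478240/s vs 509344/s). The dispatch pattern, as the run_test lines in the trace show:

  # run_test (common/autotest_common.sh) labels the test and runs it;
  # accel_test forwards its flags to accel_perf.
  run_test accel_xor accel_test -t 1 -w xor -y          # XOR of 2 source buffers
  run_test accel_xor accel_test -t 1 -w xor -y -x 3     # XOR of 3 source buffers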
00:07:05.121 [2024-07-23 05:06:35.863946] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3132283 ] 00:07:05.121 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.121 [2024-07-23 05:06:35.949098] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.121 [2024-07-23 05:06:36.030423] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.121 05:06:36 -- accel/accel.sh@21 -- # val= 00:07:05.121 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.121 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.121 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.121 05:06:36 -- accel/accel.sh@21 -- # val= 00:07:05.121 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.121 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.121 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.121 05:06:36 -- accel/accel.sh@21 -- # val=0x1 00:07:05.121 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.121 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.121 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.121 05:06:36 -- accel/accel.sh@21 -- # val= 00:07:05.121 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.121 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.121 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.121 05:06:36 -- accel/accel.sh@21 -- # val= 00:07:05.121 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.121 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val=xor 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val=3 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val= 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val=software 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@23 -- # accel_module=software 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val=32 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val=32 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- 
accel/accel.sh@21 -- # val=1 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val=Yes 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val= 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:05.122 05:06:36 -- accel/accel.sh@21 -- # val= 00:07:05.122 05:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # IFS=: 00:07:05.122 05:06:36 -- accel/accel.sh@20 -- # read -r var val 00:07:06.500 05:06:37 -- accel/accel.sh@21 -- # val= 00:07:06.500 05:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # IFS=: 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # read -r var val 00:07:06.500 05:06:37 -- accel/accel.sh@21 -- # val= 00:07:06.500 05:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # IFS=: 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # read -r var val 00:07:06.500 05:06:37 -- accel/accel.sh@21 -- # val= 00:07:06.500 05:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # IFS=: 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # read -r var val 00:07:06.500 05:06:37 -- accel/accel.sh@21 -- # val= 00:07:06.500 05:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # IFS=: 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # read -r var val 00:07:06.500 05:06:37 -- accel/accel.sh@21 -- # val= 00:07:06.500 05:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # IFS=: 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # read -r var val 00:07:06.500 05:06:37 -- accel/accel.sh@21 -- # val= 00:07:06.500 05:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # IFS=: 00:07:06.500 05:06:37 -- accel/accel.sh@20 -- # read -r var val 00:07:06.500 05:06:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:06.500 05:06:37 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:06.500 05:06:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:06.500 00:07:06.500 real 0m2.741s 00:07:06.500 user 0m2.443s 00:07:06.500 sys 0m0.294s 00:07:06.500 05:06:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:06.500 05:06:37 -- common/autotest_common.sh@10 -- # set +x 00:07:06.500 ************************************ 00:07:06.500 END TEST accel_xor 00:07:06.500 ************************************ 00:07:06.500 05:06:37 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:06.500 05:06:37 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:06.500 05:06:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:06.500 05:06:37 -- common/autotest_common.sh@10 -- # set +x 00:07:06.500 ************************************ 00:07:06.500 START TEST 
accel_dif_verify 00:07:06.500 ************************************ 00:07:06.500 05:06:37 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:07:06.500 05:06:37 -- accel/accel.sh@16 -- # local accel_opc 00:07:06.500 05:06:37 -- accel/accel.sh@17 -- # local accel_module 00:07:06.500 05:06:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:07:06.500 05:06:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:06.500 05:06:37 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.500 05:06:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:06.500 05:06:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.500 05:06:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.500 05:06:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:06.500 05:06:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:06.500 05:06:37 -- accel/accel.sh@41 -- # local IFS=, 00:07:06.500 05:06:37 -- accel/accel.sh@42 -- # jq -r . 00:07:06.500 [2024-07-23 05:06:37.279682] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:06.500 [2024-07-23 05:06:37.279771] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3132570 ] 00:07:06.500 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.500 [2024-07-23 05:06:37.377631] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.500 [2024-07-23 05:06:37.456802] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.878 05:06:38 -- accel/accel.sh@18 -- # out=' 00:07:07.878 SPDK Configuration: 00:07:07.878 Core mask: 0x1 00:07:07.878 00:07:07.878 Accel Perf Configuration: 00:07:07.878 Workload Type: dif_verify 00:07:07.878 Vector size: 4096 bytes 00:07:07.878 Transfer size: 4096 bytes 00:07:07.878 Block size: 512 bytes 00:07:07.878 Metadata size: 8 bytes 00:07:07.878 Vector count 1 00:07:07.878 Module: software 00:07:07.878 Queue depth: 32 00:07:07.878 Allocate depth: 32 00:07:07.878 # threads/core: 1 00:07:07.878 Run time: 1 seconds 00:07:07.878 Verify: No 00:07:07.878 00:07:07.878 Running for 1 seconds... 00:07:07.878 00:07:07.878 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:07.878 ------------------------------------------------------------------------------------ 00:07:07.878 0,0 160000/s 634 MiB/s 0 0 00:07:07.878 ==================================================================================== 00:07:07.878 Total 160000/s 625 MiB/s 0 0' 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:07.878 05:06:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:07.878 05:06:38 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.878 05:06:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.878 05:06:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.878 05:06:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.878 05:06:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.878 05:06:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.878 05:06:38 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.878 05:06:38 -- accel/accel.sh@42 -- # jq -r . 
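dif_verify checks T10 DIF protection information: per the configuration block, each 4096-byte transfer is treated as eight 512-byte blocks carrying 8 bytes of metadata apiece. The Total row is again transfers/s times transfer size; the slightly higher per-core figure (634 vs 625 MiB/s) is consistent with the per-core row being scaled by measured elapsed time rather than the nominal 1-second run, though that is a reading of the output, not confirmed from the accel_perf source. The totals:

  echo $(( 160000 * 4096 / 1024 / 1024 ))   # 625 MiB/s, matching the Total row
  echo $(( 4096 / 512 ))                    # 8 protected blocks per transfer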
00:07:07.878 [2024-07-23 05:06:38.660200] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:07.878 [2024-07-23 05:06:38.660290] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3132826 ] 00:07:07.878 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.878 [2024-07-23 05:06:38.758534] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.878 [2024-07-23 05:06:38.839768] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val= 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val= 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val=0x1 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val= 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val= 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val=dif_verify 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val= 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val=software 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val=32 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val=32 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val=1 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val=No 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val= 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:07.878 05:06:38 -- accel/accel.sh@21 -- # val= 00:07:07.878 05:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # IFS=: 00:07:07.878 05:06:38 -- accel/accel.sh@20 -- # read -r var val 00:07:09.255 05:06:40 -- accel/accel.sh@21 -- # val= 00:07:09.255 05:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.255 05:06:40 -- accel/accel.sh@20 -- # IFS=: 00:07:09.255 05:06:40 -- accel/accel.sh@20 -- # read -r var val 00:07:09.255 05:06:40 -- accel/accel.sh@21 -- # val= 00:07:09.255 05:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.255 05:06:40 -- accel/accel.sh@20 -- # IFS=: 00:07:09.255 05:06:40 -- accel/accel.sh@20 -- # read -r var val 00:07:09.255 05:06:40 -- accel/accel.sh@21 -- # val= 00:07:09.255 05:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.255 05:06:40 -- accel/accel.sh@20 -- # IFS=: 00:07:09.255 05:06:40 -- accel/accel.sh@20 -- # read -r var val 00:07:09.255 05:06:40 -- accel/accel.sh@21 -- # val= 00:07:09.255 05:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.255 05:06:40 -- accel/accel.sh@20 -- # IFS=: 00:07:09.256 05:06:40 -- accel/accel.sh@20 -- # read -r var val 00:07:09.256 05:06:40 -- accel/accel.sh@21 -- # val= 00:07:09.256 05:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.256 05:06:40 -- accel/accel.sh@20 -- # IFS=: 00:07:09.256 05:06:40 -- accel/accel.sh@20 -- # read -r var val 00:07:09.256 05:06:40 -- accel/accel.sh@21 -- # val= 00:07:09.256 05:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.256 05:06:40 -- accel/accel.sh@20 -- # IFS=: 00:07:09.256 05:06:40 -- accel/accel.sh@20 -- # read -r var val 00:07:09.256 05:06:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:09.256 05:06:40 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:09.256 05:06:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:09.256 00:07:09.256 real 0m2.769s 00:07:09.256 user 0m2.475s 00:07:09.256 sys 0m0.291s 00:07:09.256 05:06:40 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.256 05:06:40 -- common/autotest_common.sh@10 -- # set +x 00:07:09.256 ************************************ 00:07:09.256 END TEST accel_dif_verify 00:07:09.256 ************************************ 00:07:09.256 05:06:40 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:09.256 05:06:40 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:09.256 05:06:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.256 05:06:40 -- common/autotest_common.sh@10 -- # set +x 00:07:09.256 ************************************ 00:07:09.256 START TEST accel_dif_generate 00:07:09.256 ************************************ 00:07:09.256 05:06:40 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:07:09.256 05:06:40 -- accel/accel.sh@16 -- # local accel_opc 00:07:09.256 05:06:40 -- accel/accel.sh@17 -- # local accel_module 00:07:09.256 05:06:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:09.256 05:06:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:09.256 05:06:40 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.256 05:06:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.256 05:06:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.256 05:06:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.256 05:06:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.256 05:06:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.256 05:06:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.256 05:06:40 -- accel/accel.sh@42 -- # jq -r . 00:07:09.256 [2024-07-23 05:06:40.094693] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:09.256 [2024-07-23 05:06:40.094801] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3133050 ] 00:07:09.256 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.256 [2024-07-23 05:06:40.194405] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.256 [2024-07-23 05:06:40.279399] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.669 05:06:41 -- accel/accel.sh@18 -- # out=' 00:07:10.669 SPDK Configuration: 00:07:10.669 Core mask: 0x1 00:07:10.669 00:07:10.669 Accel Perf Configuration: 00:07:10.669 Workload Type: dif_generate 00:07:10.669 Vector size: 4096 bytes 00:07:10.669 Transfer size: 4096 bytes 00:07:10.669 Block size: 512 bytes 00:07:10.669 Metadata size: 8 bytes 00:07:10.669 Vector count 1 00:07:10.669 Module: software 00:07:10.669 Queue depth: 32 00:07:10.669 Allocate depth: 32 00:07:10.669 # threads/core: 1 00:07:10.669 Run time: 1 seconds 00:07:10.669 Verify: No 00:07:10.669 00:07:10.669 Running for 1 seconds... 
00:07:10.669 00:07:10.669 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:10.669 ------------------------------------------------------------------------------------ 00:07:10.669 0,0 194912/s 773 MiB/s 0 0 00:07:10.669 ==================================================================================== 00:07:10.669 Total 194912/s 761 MiB/s 0 0' 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:10.669 05:06:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:10.669 05:06:41 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.669 05:06:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.669 05:06:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.669 05:06:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.669 05:06:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.669 05:06:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.669 05:06:41 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.669 05:06:41 -- accel/accel.sh@42 -- # jq -r . 00:07:10.669 [2024-07-23 05:06:41.471495] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:10.669 [2024-07-23 05:06:41.471548] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3133256 ] 00:07:10.669 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.669 [2024-07-23 05:06:41.553279] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.669 [2024-07-23 05:06:41.635384] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val= 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val= 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val=0x1 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val= 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val= 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val=dif_generate 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 
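The long val=/case runs filling this trace are accel.sh echoing its settings parser: accel_perf reports its effective configuration as colon-separated pairs, and the script splits them with IFS=: and read -r var val, then matches $var in a case statement to record values such as accel_opc and accel_module. A minimal sketch of that general pattern (illustrative, not a line-for-line reconstruction of accel.sh):

  # Split "key: value" lines and record the settings of interest.
  while IFS=: read -r var val; do
      case "$var" in
          *"Workload Type"*) accel_opc=${val# } ;;      # e.g. dif_generate
          *Module*)          accel_module=${val# } ;;   # e.g. software
      esac
  done < settings.txt   # hypothetical capture of accel_perf's output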
00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val= 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val=software 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@23 -- # accel_module=software 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val=32 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val=32 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val=1 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val=No 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val= 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:10.669 05:06:41 -- accel/accel.sh@21 -- # val= 00:07:10.669 05:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # IFS=: 00:07:10.669 05:06:41 -- accel/accel.sh@20 -- # read -r var val 00:07:12.047 05:06:42 -- accel/accel.sh@21 -- # val= 00:07:12.047 05:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # IFS=: 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # read -r var val 00:07:12.047 05:06:42 -- accel/accel.sh@21 -- # val= 00:07:12.047 05:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # IFS=: 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # read -r var val 00:07:12.047 05:06:42 -- accel/accel.sh@21 -- # val= 00:07:12.047 05:06:42 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # IFS=: 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # read -r var val 00:07:12.047 05:06:42 -- accel/accel.sh@21 -- # val= 00:07:12.047 05:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # IFS=: 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # read -r var val 00:07:12.047 05:06:42 -- accel/accel.sh@21 -- # val= 00:07:12.047 05:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # IFS=: 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # read -r var val 00:07:12.047 05:06:42 -- accel/accel.sh@21 -- # val= 00:07:12.047 05:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # IFS=: 00:07:12.047 05:06:42 -- accel/accel.sh@20 -- # read -r var val 00:07:12.047 05:06:42 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:12.047 05:06:42 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:12.047 05:06:42 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.047 00:07:12.047 real 0m2.746s 00:07:12.047 user 0m2.453s 00:07:12.047 sys 0m0.292s 00:07:12.047 05:06:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.047 05:06:42 -- common/autotest_common.sh@10 -- # set +x 00:07:12.047 ************************************ 00:07:12.047 END TEST accel_dif_generate 00:07:12.047 ************************************ 00:07:12.047 05:06:42 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:12.047 05:06:42 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:12.047 05:06:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:12.047 05:06:42 -- common/autotest_common.sh@10 -- # set +x 00:07:12.047 ************************************ 00:07:12.047 START TEST accel_dif_generate_copy 00:07:12.047 ************************************ 00:07:12.047 05:06:42 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:07:12.047 05:06:42 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.047 05:06:42 -- accel/accel.sh@17 -- # local accel_module 00:07:12.047 05:06:42 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:12.047 05:06:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:12.047 05:06:42 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.047 05:06:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.047 05:06:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.047 05:06:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.047 05:06:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.047 05:06:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.047 05:06:42 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.047 05:06:42 -- accel/accel.sh@42 -- # jq -r . 00:07:12.047 [2024-07-23 05:06:42.877643] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
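With six workloads run back to back in this job (dualcast, compare, xor twice, dif_verify, dif_generate, dif_generate_copy), the summaries are easiest to compare by pulling the Total rows out of a saved console log. A small sketch; build.log is an illustrative filename:

  # Pair each "Workload Type:" with the next Total throughput row.
  awk '/Workload Type:/ { wl = $NF } /Total [0-9]+\/s/ { print wl ": " $0 }' build.log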
00:07:12.048 [2024-07-23 05:06:42.877731] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3133459 ] 00:07:12.048 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.048 [2024-07-23 05:06:42.975835] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.048 [2024-07-23 05:06:43.058309] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.427 05:06:44 -- accel/accel.sh@18 -- # out=' 00:07:13.428 SPDK Configuration: 00:07:13.428 Core mask: 0x1 00:07:13.428 00:07:13.428 Accel Perf Configuration: 00:07:13.428 Workload Type: dif_generate_copy 00:07:13.428 Vector size: 4096 bytes 00:07:13.428 Transfer size: 4096 bytes 00:07:13.428 Vector count 1 00:07:13.428 Module: software 00:07:13.428 Queue depth: 32 00:07:13.428 Allocate depth: 32 00:07:13.428 # threads/core: 1 00:07:13.428 Run time: 1 seconds 00:07:13.428 Verify: No 00:07:13.428 00:07:13.428 Running for 1 seconds... 00:07:13.428 00:07:13.428 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:13.428 ------------------------------------------------------------------------------------ 00:07:13.428 0,0 149696/s 593 MiB/s 0 0 00:07:13.428 ==================================================================================== 00:07:13.428 Total 149696/s 584 MiB/s 0 0' 00:07:13.428 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.428 05:06:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:13.428 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.428 05:06:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:13.428 05:06:44 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.428 05:06:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.428 05:06:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.428 05:06:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.428 05:06:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.428 05:06:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.428 05:06:44 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.428 05:06:44 -- accel/accel.sh@42 -- # jq -r . 00:07:13.428 [2024-07-23 05:06:44.263286] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:13.428 [2024-07-23 05:06:44.263422] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3133702 ] 00:07:13.428 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.428 [2024-07-23 05:06:44.420217] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.428 [2024-07-23 05:06:44.513293] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val= 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val= 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val=0x1 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val= 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val= 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val= 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val=software 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@23 -- # accel_module=software 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val=32 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.687 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.687 05:06:44 -- accel/accel.sh@21 -- # val=32 00:07:13.687 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # read -r 
var val 00:07:13.688 05:06:44 -- accel/accel.sh@21 -- # val=1 00:07:13.688 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.688 05:06:44 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:13.688 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.688 05:06:44 -- accel/accel.sh@21 -- # val=No 00:07:13.688 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.688 05:06:44 -- accel/accel.sh@21 -- # val= 00:07:13.688 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:13.688 05:06:44 -- accel/accel.sh@21 -- # val= 00:07:13.688 05:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # IFS=: 00:07:13.688 05:06:44 -- accel/accel.sh@20 -- # read -r var val 00:07:14.626 05:06:45 -- accel/accel.sh@21 -- # val= 00:07:14.626 05:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # IFS=: 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # read -r var val 00:07:14.626 05:06:45 -- accel/accel.sh@21 -- # val= 00:07:14.626 05:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # IFS=: 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # read -r var val 00:07:14.626 05:06:45 -- accel/accel.sh@21 -- # val= 00:07:14.626 05:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # IFS=: 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # read -r var val 00:07:14.626 05:06:45 -- accel/accel.sh@21 -- # val= 00:07:14.626 05:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # IFS=: 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # read -r var val 00:07:14.626 05:06:45 -- accel/accel.sh@21 -- # val= 00:07:14.626 05:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # IFS=: 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # read -r var val 00:07:14.626 05:06:45 -- accel/accel.sh@21 -- # val= 00:07:14.626 05:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # IFS=: 00:07:14.626 05:06:45 -- accel/accel.sh@20 -- # read -r var val 00:07:14.626 05:06:45 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:14.626 05:06:45 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:14.626 05:06:45 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.626 00:07:14.626 real 0m2.840s 00:07:14.626 user 0m2.470s 00:07:14.626 sys 0m0.365s 00:07:14.626 05:06:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.626 05:06:45 -- common/autotest_common.sh@10 -- # set +x 00:07:14.626 ************************************ 00:07:14.626 END TEST accel_dif_generate_copy 00:07:14.626 ************************************ 00:07:14.885 05:06:45 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:14.885 05:06:45 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:14.885 05:06:45 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:07:14.885 05:06:45 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:07:14.885 05:06:45 -- common/autotest_common.sh@10 -- # set +x 00:07:14.885 ************************************ 00:07:14.885 START TEST accel_comp 00:07:14.885 ************************************ 00:07:14.885 05:06:45 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:14.885 05:06:45 -- accel/accel.sh@16 -- # local accel_opc 00:07:14.885 05:06:45 -- accel/accel.sh@17 -- # local accel_module 00:07:14.885 05:06:45 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:14.885 05:06:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:14.885 05:06:45 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.885 05:06:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:14.885 05:06:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.885 05:06:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.885 05:06:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:14.885 05:06:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:14.885 05:06:45 -- accel/accel.sh@41 -- # local IFS=, 00:07:14.885 05:06:45 -- accel/accel.sh@42 -- # jq -r . 00:07:14.885 [2024-07-23 05:06:45.756694] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:14.885 [2024-07-23 05:06:45.756794] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3133989 ] 00:07:14.885 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.885 [2024-07-23 05:06:45.855731] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.885 [2024-07-23 05:06:45.941452] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.264 05:06:47 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:16.264 00:07:16.264 SPDK Configuration: 00:07:16.264 Core mask: 0x1 00:07:16.264 00:07:16.264 Accel Perf Configuration: 00:07:16.264 Workload Type: compress 00:07:16.264 Transfer size: 4096 bytes 00:07:16.264 Vector count 1 00:07:16.264 Module: software 00:07:16.264 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.264 Queue depth: 32 00:07:16.264 Allocate depth: 32 00:07:16.264 # threads/core: 1 00:07:16.264 Run time: 1 seconds 00:07:16.264 Verify: No 00:07:16.264 00:07:16.264 Running for 1 seconds... 
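The compress case adds -l, pointing accel_perf at a seed file (test/accel/bib in the checkout) whose contents each 4096-byte transfer compresses; that is the "File Name" line in the dump above. A hedged reproduction under the same $SPDK_DIR assumption:

    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
    # -l: input file to compress; other settings left at their defaults.
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w compress -l "$SPDK_DIR/test/accel/bib"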
00:07:16.264 00:07:16.264 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:16.264 ------------------------------------------------------------------------------------ 00:07:16.264 0,0 48256/s 188 MiB/s 0 0 00:07:16.264 ==================================================================================== 00:07:16.264 Total 48256/s 188 MiB/s 0 0' 00:07:16.264 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.264 05:06:47 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.264 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.264 05:06:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.264 05:06:47 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.264 05:06:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.264 05:06:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.264 05:06:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.264 05:06:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.264 05:06:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.264 05:06:47 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.264 05:06:47 -- accel/accel.sh@42 -- # jq -r . 00:07:16.264 [2024-07-23 05:06:47.149118] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:16.264 [2024-07-23 05:06:47.149253] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3134262 ] 00:07:16.264 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.264 [2024-07-23 05:06:47.306304] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.524 [2024-07-23 05:06:47.394473] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val= 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val= 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val= 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val=0x1 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val= 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val= 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val=compress 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524
05:06:47 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val= 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val=software 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@23 -- # accel_module=software 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val=32 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val=32 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val=1 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val=No 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val= 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:16.524 05:06:47 -- accel/accel.sh@21 -- # val= 00:07:16.524 05:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:16.524 05:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:17.904 05:06:48 -- accel/accel.sh@21 -- # val= 00:07:17.904 05:06:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # IFS=: 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # read -r var val 00:07:17.904 05:06:48 -- accel/accel.sh@21 -- # val= 00:07:17.904 05:06:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # IFS=: 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # read -r var val 00:07:17.904 05:06:48 -- accel/accel.sh@21 -- # val= 00:07:17.904 05:06:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # 
IFS=: 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # read -r var val 00:07:17.904 05:06:48 -- accel/accel.sh@21 -- # val= 00:07:17.904 05:06:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # IFS=: 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # read -r var val 00:07:17.904 05:06:48 -- accel/accel.sh@21 -- # val= 00:07:17.904 05:06:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # IFS=: 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # read -r var val 00:07:17.904 05:06:48 -- accel/accel.sh@21 -- # val= 00:07:17.904 05:06:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # IFS=: 00:07:17.904 05:06:48 -- accel/accel.sh@20 -- # read -r var val 00:07:17.904 05:06:48 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:17.904 05:06:48 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:17.904 05:06:48 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.904 00:07:17.904 real 0m2.846s 00:07:17.904 user 0m2.493s 00:07:17.904 sys 0m0.349s 00:07:17.904 05:06:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.904 05:06:48 -- common/autotest_common.sh@10 -- # set +x 00:07:17.904 ************************************ 00:07:17.904 END TEST accel_comp 00:07:17.904 ************************************ 00:07:17.904 05:06:48 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:17.904 05:06:48 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:17.904 05:06:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:17.904 05:06:48 -- common/autotest_common.sh@10 -- # set +x 00:07:17.904 ************************************ 00:07:17.904 START TEST accel_decomp 00:07:17.904 ************************************ 00:07:17.904 05:06:48 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:17.904 05:06:48 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.904 05:06:48 -- accel/accel.sh@17 -- # local accel_module 00:07:17.904 05:06:48 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:17.904 05:06:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:17.904 05:06:48 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.904 05:06:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.904 05:06:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.904 05:06:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.904 05:06:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.904 05:06:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.904 05:06:48 -- accel/accel.sh@41 -- # local IFS=, 00:07:17.904 05:06:48 -- accel/accel.sh@42 -- # jq -r . 00:07:17.904 [2024-07-23 05:06:48.640049] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:17.904 [2024-07-23 05:06:48.640140] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3134549 ] 00:07:17.904 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.904 [2024-07-23 05:06:48.738264] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.904 [2024-07-23 05:06:48.819622] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.284 05:06:50 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:19.284 00:07:19.284 SPDK Configuration: 00:07:19.284 Core mask: 0x1 00:07:19.284 00:07:19.284 Accel Perf Configuration: 00:07:19.284 Workload Type: decompress 00:07:19.284 Transfer size: 4096 bytes 00:07:19.284 Vector count 1 00:07:19.284 Module: software 00:07:19.284 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.284 Queue depth: 32 00:07:19.284 Allocate depth: 32 00:07:19.284 # threads/core: 1 00:07:19.284 Run time: 1 seconds 00:07:19.284 Verify: Yes 00:07:19.284 00:07:19.284 Running for 1 seconds... 00:07:19.284 00:07:19.284 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:19.284 ------------------------------------------------------------------------------------ 00:07:19.284 0,0 67008/s 261 MiB/s 0 0 00:07:19.285 ==================================================================================== 00:07:19.285 Total 67008/s 261 MiB/s 0 0' 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:19.285 05:06:50 -- accel/accel.sh@12 -- # build_accel_config 00:07:19.285 05:06:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:19.285 05:06:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:19.285 05:06:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:19.285 05:06:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:19.285 05:06:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:19.285 05:06:50 -- accel/accel.sh@41 -- # local IFS=, 00:07:19.285 05:06:50 -- accel/accel.sh@42 -- # jq -r . 00:07:19.285 [2024-07-23 05:06:50.012810] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
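The Bandwidth column follows from the Transfers column: transfers per second times the transfer size, reported in MiB/s. Checking the decompress Total row above (67008 transfers/s at 4096 bytes each) as a shell one-liner:

    # 67008 * 4096 bytes per second, expressed in MiB/s:
    echo $((67008 * 4096 / 1024 / 1024))   # prints 261, matching the table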
00:07:19.285 [2024-07-23 05:06:50.012863] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3134821 ] 00:07:19.285 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.285 [2024-07-23 05:06:50.106327] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.285 [2024-07-23 05:06:50.190089] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val= 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val= 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val= 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val=0x1 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val= 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val= 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val=decompress 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val= 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val=software 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@23 -- # accel_module=software 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val=32 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 
05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val=32 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val=1 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val=Yes 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val= 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:19.285 05:06:50 -- accel/accel.sh@21 -- # val= 00:07:19.285 05:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:19.285 05:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:20.665 05:06:51 -- accel/accel.sh@21 -- # val= 00:07:20.665 05:06:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # IFS=: 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # read -r var val 00:07:20.665 05:06:51 -- accel/accel.sh@21 -- # val= 00:07:20.665 05:06:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # IFS=: 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # read -r var val 00:07:20.665 05:06:51 -- accel/accel.sh@21 -- # val= 00:07:20.665 05:06:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # IFS=: 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # read -r var val 00:07:20.665 05:06:51 -- accel/accel.sh@21 -- # val= 00:07:20.665 05:06:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # IFS=: 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # read -r var val 00:07:20.665 05:06:51 -- accel/accel.sh@21 -- # val= 00:07:20.665 05:06:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # IFS=: 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # read -r var val 00:07:20.665 05:06:51 -- accel/accel.sh@21 -- # val= 00:07:20.665 05:06:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # IFS=: 00:07:20.665 05:06:51 -- accel/accel.sh@20 -- # read -r var val 00:07:20.665 05:06:51 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:20.665 05:06:51 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:20.665 05:06:51 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.665 00:07:20.665 real 0m2.757s 00:07:20.665 user 0m2.463s 00:07:20.665 sys 0m0.290s 00:07:20.665 05:06:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.665 05:06:51 -- common/autotest_common.sh@10 -- # set +x 00:07:20.665 ************************************ 00:07:20.665 END TEST accel_decomp 00:07:20.665 ************************************ 00:07:20.665 05:06:51 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:20.665 05:06:51 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:20.665 05:06:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.665 05:06:51 -- common/autotest_common.sh@10 -- # set +x 00:07:20.665 ************************************ 00:07:20.665 START TEST accel_decmop_full 00:07:20.665 ************************************ 00:07:20.665 05:06:51 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:20.665 05:06:51 -- accel/accel.sh@16 -- # local accel_opc 00:07:20.665 05:06:51 -- accel/accel.sh@17 -- # local accel_module 00:07:20.665 05:06:51 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:20.665 05:06:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:20.665 05:06:51 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.665 05:06:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:20.665 05:06:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.665 05:06:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.665 05:06:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:20.665 05:06:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:20.665 05:06:51 -- accel/accel.sh@41 -- # local IFS=, 00:07:20.665 05:06:51 -- accel/accel.sh@42 -- # jq -r . 00:07:20.665 [2024-07-23 05:06:51.434750] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:20.665 [2024-07-23 05:06:51.434830] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3135102 ] 00:07:20.665 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.665 [2024-07-23 05:06:51.530866] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.665 [2024-07-23 05:06:51.613233] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.045 05:06:52 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:22.045 00:07:22.045 SPDK Configuration: 00:07:22.045 Core mask: 0x1 00:07:22.045 00:07:22.045 Accel Perf Configuration: 00:07:22.045 Workload Type: decompress 00:07:22.045 Transfer size: 111250 bytes 00:07:22.045 Vector count 1 00:07:22.045 Module: software 00:07:22.045 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:22.045 Queue depth: 32 00:07:22.045 Allocate depth: 32 00:07:22.045 # threads/core: 1 00:07:22.045 Run time: 1 seconds 00:07:22.045 Verify: Yes 00:07:22.045 00:07:22.045 Running for 1 seconds... 
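Relative to plain accel_decomp, this run adds -y (verify the decompressed output, hence "Verify: Yes") and -o 0. Judging from the dump above, -o 0 appears to make accel_perf size transfers from the decompressed data itself rather than the 4096-byte default; that reading is an assumption, supported only by the reported "Transfer size: 111250 bytes". A sketch:

    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
    # -y: verify output; -o 0: transfer size taken from the input data
    # (assumed from the "Transfer size: 111250 bytes" line above).
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress -y -o 0 \
        -l "$SPDK_DIR/test/accel/bib"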
00:07:22.045 00:07:22.045 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:22.045 ------------------------------------------------------------------------------------ 00:07:22.045 0,0 4192/s 444 MiB/s 0 0 00:07:22.045 ==================================================================================== 00:07:22.045 Total 4192/s 444 MiB/s 0 0' 00:07:22.045 05:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:22.045 05:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:22.045 05:06:52 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.045 05:06:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:22.045 05:06:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.045 05:06:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.045 05:06:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:22.045 05:06:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:22.045 05:06:52 -- accel/accel.sh@41 -- # local IFS=, 00:07:22.045 05:06:52 -- accel/accel.sh@42 -- # jq -r . 00:07:22.045 [2024-07-23 05:06:52.818806] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:22.045 [2024-07-23 05:06:52.818858] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3135381 ] 00:07:22.045 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.045 [2024-07-23 05:06:52.903804] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.045 [2024-07-23 05:06:52.984571] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val= 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val= 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val= 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val=0x1 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val= 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val= 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val=decompress 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case
"$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val= 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val=software 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@23 -- # accel_module=software 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val=32 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val=32 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val=1 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.045 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.045 05:06:53 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:22.045 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.046 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.046 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.046 05:06:53 -- accel/accel.sh@21 -- # val=Yes 00:07:22.046 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.046 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.046 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.046 05:06:53 -- accel/accel.sh@21 -- # val= 00:07:22.046 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.046 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.046 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:22.046 05:06:53 -- accel/accel.sh@21 -- # val= 00:07:22.046 05:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.046 05:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:22.046 05:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:23.424 05:06:54 -- accel/accel.sh@21 -- # val= 00:07:23.424 05:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.424 05:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:23.424 05:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:23.424 05:06:54 -- accel/accel.sh@21 -- # val= 00:07:23.424 05:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.424 05:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:23.424 05:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:23.424 05:06:54 -- accel/accel.sh@21 -- # val= 00:07:23.424 05:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.424 05:06:54 
-- accel/accel.sh@20 -- # IFS=: 00:07:23.424 05:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:23.424 05:06:54 -- accel/accel.sh@21 -- # val= 00:07:23.424 05:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.424 05:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:23.424 05:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:23.424 05:06:54 -- accel/accel.sh@21 -- # val= 00:07:23.424 05:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.424 05:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:23.424 05:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:23.424 05:06:54 -- accel/accel.sh@21 -- # val= 00:07:23.424 05:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.425 05:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:23.425 05:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:23.425 05:06:54 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:23.425 05:06:54 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:23.425 05:06:54 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:23.425 00:07:23.425 real 0m2.769s 00:07:23.425 user 0m2.481s 00:07:23.425 sys 0m0.284s 00:07:23.425 05:06:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.425 05:06:54 -- common/autotest_common.sh@10 -- # set +x 00:07:23.425 ************************************ 00:07:23.425 END TEST accel_decmop_full 00:07:23.425 ************************************ 00:07:23.425 05:06:54 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:23.425 05:06:54 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:23.425 05:06:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:23.425 05:06:54 -- common/autotest_common.sh@10 -- # set +x 00:07:23.425 ************************************ 00:07:23.425 START TEST accel_decomp_mcore 00:07:23.425 ************************************ 00:07:23.425 05:06:54 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:23.425 05:06:54 -- accel/accel.sh@16 -- # local accel_opc 00:07:23.425 05:06:54 -- accel/accel.sh@17 -- # local accel_module 00:07:23.425 05:06:54 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:23.425 05:06:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:23.425 05:06:54 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.425 05:06:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:23.425 05:06:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.425 05:06:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.425 05:06:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:23.425 05:06:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:23.425 05:06:54 -- accel/accel.sh@41 -- # local IFS=, 00:07:23.425 05:06:54 -- accel/accel.sh@42 -- # jq -r . 00:07:23.425 [2024-07-23 05:06:54.243510] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:23.425 [2024-07-23 05:06:54.243589] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3135662 ] 00:07:23.425 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.425 [2024-07-23 05:06:54.339975] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:23.425 [2024-07-23 05:06:54.424889] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.425 [2024-07-23 05:06:54.424984] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:23.425 [2024-07-23 05:06:54.425071] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:23.425 [2024-07-23 05:06:54.425073] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.804 05:06:55 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:24.804 00:07:24.804 SPDK Configuration: 00:07:24.804 Core mask: 0xf 00:07:24.804 00:07:24.804 Accel Perf Configuration: 00:07:24.804 Workload Type: decompress 00:07:24.804 Transfer size: 4096 bytes 00:07:24.804 Vector count 1 00:07:24.804 Module: software 00:07:24.804 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.804 Queue depth: 32 00:07:24.804 Allocate depth: 32 00:07:24.804 # threads/core: 1 00:07:24.804 Run time: 1 seconds 00:07:24.804 Verify: Yes 00:07:24.804 00:07:24.804 Running for 1 seconds... 00:07:24.804 00:07:24.804 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:24.804 ------------------------------------------------------------------------------------ 00:07:24.804 0,0 58400/s 228 MiB/s 0 0 00:07:24.804 3,0 59104/s 230 MiB/s 0 0 00:07:24.804 2,0 78144/s 305 MiB/s 0 0 00:07:24.804 1,0 58976/s 230 MiB/s 0 0 00:07:24.804 ==================================================================================== 00:07:24.804 Total 254624/s 994 MiB/s 0 0' 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:24.804 05:06:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:24.804 05:06:55 -- accel/accel.sh@12 -- # build_accel_config 00:07:24.804 05:06:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:24.804 05:06:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:24.804 05:06:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:24.804 05:06:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:24.804 05:06:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:24.804 05:06:55 -- accel/accel.sh@41 -- # local IFS=, 00:07:24.804 05:06:55 -- accel/accel.sh@42 -- # jq -r . 00:07:24.804 [2024-07-23 05:06:55.638581] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
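The mcore variant passes -m 0xf, a core mask whose four set bits start reactors on cores 0-3; each reactor then reports its own Core,Thread row, and Total sums them, as the table above shows. A sketch under the same $SPDK_DIR assumption as earlier:

    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
    # -m 0xf = binary 1111: run on cores 0, 1, 2 and 3.
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress -y -m 0xf \
        -l "$SPDK_DIR/test/accel/bib"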
00:07:24.804 [2024-07-23 05:06:55.638657] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3135915 ] 00:07:24.804 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.804 [2024-07-23 05:06:55.735270] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:24.804 [2024-07-23 05:06:55.821690] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:24.804 [2024-07-23 05:06:55.821783] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:24.804 [2024-07-23 05:06:55.821867] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:24.804 [2024-07-23 05:06:55.821871] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val= 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val= 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val= 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val=0xf 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val= 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val= 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val=decompress 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val= 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val=software 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@23 -- # accel_module=software 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.804 05:06:55 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.804 05:06:55 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:24.804 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.805 05:06:55 -- accel/accel.sh@21 -- # val=32 00:07:24.805 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.805 05:06:55 -- accel/accel.sh@21 -- # val=32 00:07:24.805 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.805 05:06:55 -- accel/accel.sh@21 -- # val=1 00:07:24.805 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.805 05:06:55 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:24.805 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.805 05:06:55 -- accel/accel.sh@21 -- # val=Yes 00:07:24.805 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.805 05:06:55 -- accel/accel.sh@21 -- # val= 00:07:24.805 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:24.805 05:06:55 -- accel/accel.sh@21 -- # val= 00:07:24.805 05:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:24.805 05:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@21 -- # val= 00:07:26.184 05:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@21 -- # val= 00:07:26.184 05:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@21 -- # val= 00:07:26.184 05:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@21 -- # val= 00:07:26.184 05:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@21 -- # val= 00:07:26.184 05:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@21 -- # val= 00:07:26.184 05:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@21 -- # val= 00:07:26.184 05:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@21 -- # val= 00:07:26.184 05:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.184 
05:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@21 -- # val= 00:07:26.184 05:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:26.184 05:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:26.184 05:06:57 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:26.184 05:06:57 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:26.184 05:06:57 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:26.184 00:07:26.184 real 0m2.801s 00:07:26.184 user 0m9.173s 00:07:26.184 sys 0m0.331s 00:07:26.184 05:06:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:26.184 05:06:57 -- common/autotest_common.sh@10 -- # set +x 00:07:26.184 ************************************ 00:07:26.184 END TEST accel_decomp_mcore 00:07:26.184 ************************************ 00:07:26.184 05:06:57 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:26.184 05:06:57 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:26.184 05:06:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:26.184 05:06:57 -- common/autotest_common.sh@10 -- # set +x 00:07:26.184 ************************************ 00:07:26.184 START TEST accel_decomp_full_mcore 00:07:26.184 ************************************ 00:07:26.184 05:06:57 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:26.184 05:06:57 -- accel/accel.sh@16 -- # local accel_opc 00:07:26.184 05:06:57 -- accel/accel.sh@17 -- # local accel_module 00:07:26.184 05:06:57 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:26.184 05:06:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:26.184 05:06:57 -- accel/accel.sh@12 -- # build_accel_config 00:07:26.184 05:06:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:26.184 05:06:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.184 05:06:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.184 05:06:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:26.184 05:06:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:26.184 05:06:57 -- accel/accel.sh@41 -- # local IFS=, 00:07:26.184 05:06:57 -- accel/accel.sh@42 -- # jq -r . 00:07:26.184 [2024-07-23 05:06:57.094079] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:26.184 [2024-07-23 05:06:57.094162] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3136154 ] 00:07:26.184 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.184 [2024-07-23 05:06:57.190965] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:26.443 [2024-07-23 05:06:57.277001] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:26.443 [2024-07-23 05:06:57.277096] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:26.443 [2024-07-23 05:06:57.277160] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:26.443 [2024-07-23 05:06:57.277163] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.420 05:06:58 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:27.420 00:07:27.420 SPDK Configuration: 00:07:27.420 Core mask: 0xf 00:07:27.420 00:07:27.420 Accel Perf Configuration: 00:07:27.420 Workload Type: decompress 00:07:27.420 Transfer size: 111250 bytes 00:07:27.420 Vector count 1 00:07:27.420 Module: software 00:07:27.420 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:27.420 Queue depth: 32 00:07:27.420 Allocate depth: 32 00:07:27.420 # threads/core: 1 00:07:27.420 Run time: 1 seconds 00:07:27.420 Verify: Yes 00:07:27.420 00:07:27.420 Running for 1 seconds... 00:07:27.420 00:07:27.420 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:27.420 ------------------------------------------------------------------------------------ 00:07:27.420 0,0 4160/s 441 MiB/s 0 0 00:07:27.420 3,0 4160/s 441 MiB/s 0 0 00:07:27.420 2,0 5760/s 611 MiB/s 0 0 00:07:27.420 1,0 4160/s 441 MiB/s 0 0 00:07:27.420 ==================================================================================== 00:07:27.420 Total 18240/s 1935 MiB/s 0 0' 00:07:27.420 05:06:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:27.420 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.420 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.420 05:06:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:27.420 05:06:58 -- accel/accel.sh@12 -- # build_accel_config 00:07:27.421 05:06:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:27.421 05:06:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:27.421 05:06:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:27.421 05:06:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:27.421 05:06:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:27.421 05:06:58 -- accel/accel.sh@41 -- # local IFS=, 00:07:27.421 05:06:58 -- accel/accel.sh@42 -- # jq -r . 00:07:27.421 [2024-07-23 05:06:58.494270] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
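As in the single-core tables, the per-core rows should sum to the Total row; for the full-mcore run above (111250-byte transfers, three cores at 4160/s and one at 5760/s):

    # Aggregate bandwidth check for the table above:
    echo $(((3 * 4160 + 5760) * 111250 / 1024 / 1024))   # prints 1935 MiB/s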
00:07:27.421 [2024-07-23 05:06:58.494335] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3136369 ] 00:07:27.680 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.680 [2024-07-23 05:06:58.578105] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:27.680 [2024-07-23 05:06:58.669928] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:27.680 [2024-07-23 05:06:58.670021] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:27.680 [2024-07-23 05:06:58.670103] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:27.680 [2024-07-23 05:06:58.670107] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val= 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val= 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val= 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val=0xf 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val= 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val= 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val=decompress 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val= 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val=software 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@23 -- # accel_module=software 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.680 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.680 05:06:58 -- accel/accel.sh@21 -- # val=32 00:07:27.680 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.681 05:06:58 -- accel/accel.sh@21 -- # val=32 00:07:27.681 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.681 05:06:58 -- accel/accel.sh@21 -- # val=1 00:07:27.681 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.681 05:06:58 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:27.681 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.681 05:06:58 -- accel/accel.sh@21 -- # val=Yes 00:07:27.681 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.681 05:06:58 -- accel/accel.sh@21 -- # val= 00:07:27.681 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:27.681 05:06:58 -- accel/accel.sh@21 -- # val= 00:07:27.681 05:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:27.681 05:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:29.060 05:06:59 -- accel/accel.sh@21 -- # val= 00:07:29.060 05:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.060 05:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:29.060 05:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:29.060 05:06:59 -- accel/accel.sh@21 -- # val= 00:07:29.060 05:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.060 05:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:29.060 05:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:29.060 05:06:59 -- accel/accel.sh@21 -- # val= 00:07:29.060 05:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.060 05:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:29.060 05:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:29.061 05:06:59 -- accel/accel.sh@21 -- # val= 00:07:29.061 05:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:29.061 05:06:59 -- accel/accel.sh@21 -- # val= 00:07:29.061 05:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:29.061 05:06:59 -- accel/accel.sh@21 -- # val= 00:07:29.061 05:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:29.061 05:06:59 -- accel/accel.sh@21 -- # val= 00:07:29.061 05:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:29.061 05:06:59 -- accel/accel.sh@21 -- # val= 00:07:29.061 05:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.061 
05:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:29.061 05:06:59 -- accel/accel.sh@21 -- # val= 00:07:29.061 05:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:29.061 05:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:29.061 05:06:59 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:29.061 05:06:59 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:29.061 05:06:59 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:29.061 00:07:29.061 real 0m2.809s 00:07:29.061 user 0m9.252s 00:07:29.061 sys 0m0.317s 00:07:29.061 05:06:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:29.061 05:06:59 -- common/autotest_common.sh@10 -- # set +x 00:07:29.061 ************************************ 00:07:29.061 END TEST accel_decomp_full_mcore 00:07:29.061 ************************************ 00:07:29.061 05:06:59 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:29.061 05:06:59 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:29.061 05:06:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:29.061 05:06:59 -- common/autotest_common.sh@10 -- # set +x 00:07:29.061 ************************************ 00:07:29.061 START TEST accel_decomp_mthread 00:07:29.061 ************************************ 00:07:29.061 05:06:59 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:29.061 05:06:59 -- accel/accel.sh@16 -- # local accel_opc 00:07:29.061 05:06:59 -- accel/accel.sh@17 -- # local accel_module 00:07:29.061 05:06:59 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:29.061 05:06:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:29.061 05:06:59 -- accel/accel.sh@12 -- # build_accel_config 00:07:29.061 05:06:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:29.061 05:06:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.061 05:06:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.061 05:06:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:29.061 05:06:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:29.061 05:06:59 -- accel/accel.sh@41 -- # local IFS=, 00:07:29.061 05:06:59 -- accel/accel.sh@42 -- # jq -r . 00:07:29.061 [2024-07-23 05:06:59.952090] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:29.061 [2024-07-23 05:06:59.952186] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3136607 ] 00:07:29.061 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.061 [2024-07-23 05:07:00.052615] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.061 [2024-07-23 05:07:00.140827] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.440 05:07:01 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:30.440 00:07:30.440 SPDK Configuration: 00:07:30.440 Core mask: 0x1 00:07:30.440 00:07:30.440 Accel Perf Configuration: 00:07:30.440 Workload Type: decompress 00:07:30.440 Transfer size: 4096 bytes 00:07:30.440 Vector count 1 00:07:30.440 Module: software 00:07:30.440 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:30.440 Queue depth: 32 00:07:30.440 Allocate depth: 32 00:07:30.440 # threads/core: 2 00:07:30.440 Run time: 1 seconds 00:07:30.440 Verify: Yes 00:07:30.440 00:07:30.440 Running for 1 seconds... 00:07:30.440 00:07:30.440 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:30.440 ------------------------------------------------------------------------------------ 00:07:30.440 0,1 33952/s 62 MiB/s 0 0 00:07:30.440 0,0 33856/s 62 MiB/s 0 0 00:07:30.440 ==================================================================================== 00:07:30.440 Total 67808/s 264 MiB/s 0 0' 00:07:30.440 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.440 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.440 05:07:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:30.440 05:07:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:30.440 05:07:01 -- accel/accel.sh@12 -- # build_accel_config 00:07:30.440 05:07:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:30.440 05:07:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.440 05:07:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.440 05:07:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:30.440 05:07:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:30.440 05:07:01 -- accel/accel.sh@41 -- # local IFS=, 00:07:30.440 05:07:01 -- accel/accel.sh@42 -- # jq -r . 00:07:30.440 [2024-07-23 05:07:01.351640] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:30.440 [2024-07-23 05:07:01.351723] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3136815 ] 00:07:30.440 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.440 [2024-07-23 05:07:01.447721] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.440 [2024-07-23 05:07:01.529109] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.699 05:07:01 -- accel/accel.sh@21 -- # val= 00:07:30.699 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.699 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.699 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.699 05:07:01 -- accel/accel.sh@21 -- # val= 00:07:30.699 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.699 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.699 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.699 05:07:01 -- accel/accel.sh@21 -- # val= 00:07:30.699 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val=0x1 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val= 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val= 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val=decompress 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val= 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val=software 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@23 -- # accel_module=software 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val=32 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 
05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val=32 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val=2 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val=Yes 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val= 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:30.700 05:07:01 -- accel/accel.sh@21 -- # val= 00:07:30.700 05:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:30.700 05:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:31.639 05:07:02 -- accel/accel.sh@21 -- # val= 00:07:31.639 05:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:31.639 05:07:02 -- accel/accel.sh@21 -- # val= 00:07:31.639 05:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:31.639 05:07:02 -- accel/accel.sh@21 -- # val= 00:07:31.639 05:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:31.639 05:07:02 -- accel/accel.sh@21 -- # val= 00:07:31.639 05:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:31.639 05:07:02 -- accel/accel.sh@21 -- # val= 00:07:31.639 05:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:31.639 05:07:02 -- accel/accel.sh@21 -- # val= 00:07:31.639 05:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:31.639 05:07:02 -- accel/accel.sh@21 -- # val= 00:07:31.639 05:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:31.639 05:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:31.639 05:07:02 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:31.639 05:07:02 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:31.639 05:07:02 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:31.639 00:07:31.639 real 0m2.794s 00:07:31.639 user 0m2.494s 00:07:31.639 sys 0m0.308s 00:07:31.639 05:07:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.639 05:07:02 -- common/autotest_common.sh@10 -- # 
set +x 00:07:31.639 ************************************ 00:07:31.639 END TEST accel_decomp_mthread 00:07:31.639 ************************************ 00:07:31.898 05:07:02 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:31.898 05:07:02 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:31.898 05:07:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:31.898 05:07:02 -- common/autotest_common.sh@10 -- # set +x 00:07:31.898 ************************************ 00:07:31.898 START TEST accel_deomp_full_mthread 00:07:31.898 ************************************ 00:07:31.898 05:07:02 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:31.898 05:07:02 -- accel/accel.sh@16 -- # local accel_opc 00:07:31.898 05:07:02 -- accel/accel.sh@17 -- # local accel_module 00:07:31.898 05:07:02 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:31.898 05:07:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:31.898 05:07:02 -- accel/accel.sh@12 -- # build_accel_config 00:07:31.898 05:07:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:31.898 05:07:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.898 05:07:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.898 05:07:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:31.898 05:07:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:31.898 05:07:02 -- accel/accel.sh@41 -- # local IFS=, 00:07:31.898 05:07:02 -- accel/accel.sh@42 -- # jq -r . 00:07:31.898 [2024-07-23 05:07:02.796944] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:31.898 [2024-07-23 05:07:02.797032] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3137100 ] 00:07:31.898 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.898 [2024-07-23 05:07:02.895348] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.898 [2024-07-23 05:07:02.978194] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.276 05:07:04 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:33.276 00:07:33.276 SPDK Configuration: 00:07:33.276 Core mask: 0x1 00:07:33.276 00:07:33.276 Accel Perf Configuration: 00:07:33.276 Workload Type: decompress 00:07:33.276 Transfer size: 111250 bytes 00:07:33.276 Vector count 1 00:07:33.276 Module: software 00:07:33.276 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.276 Queue depth: 32 00:07:33.276 Allocate depth: 32 00:07:33.276 # threads/core: 2 00:07:33.276 Run time: 1 seconds 00:07:33.276 Verify: Yes 00:07:33.276 00:07:33.276 Running for 1 seconds... 
00:07:33.276 00:07:33.276 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:33.276 ------------------------------------------------------------------------------------ 00:07:33.276 0,1 2144/s 88 MiB/s 0 0 00:07:33.276 0,0 2112/s 87 MiB/s 0 0 00:07:33.276 ==================================================================================== 00:07:33.276 Total 4256/s 451 MiB/s 0 0' 00:07:33.276 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.276 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.276 05:07:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:33.276 05:07:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:33.276 05:07:04 -- accel/accel.sh@12 -- # build_accel_config 00:07:33.276 05:07:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:33.276 05:07:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.276 05:07:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.276 05:07:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:33.276 05:07:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:33.276 05:07:04 -- accel/accel.sh@41 -- # local IFS=, 00:07:33.276 05:07:04 -- accel/accel.sh@42 -- # jq -r . 00:07:33.276 [2024-07-23 05:07:04.215968] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:33.276 [2024-07-23 05:07:04.216066] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3137370 ] 00:07:33.276 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.276 [2024-07-23 05:07:04.314218] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.536 [2024-07-23 05:07:04.396627] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val= 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val= 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val= 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val=0x1 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val= 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val= 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val=decompress 
00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val= 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val=software 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@23 -- # accel_module=software 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val=32 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val=32 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val=2 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val=Yes 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val= 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:33.536 05:07:04 -- accel/accel.sh@21 -- # val= 00:07:33.536 05:07:04 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # IFS=: 00:07:33.536 05:07:04 -- accel/accel.sh@20 -- # read -r var val 00:07:34.916 05:07:05 -- accel/accel.sh@21 -- # val= 00:07:34.916 05:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:34.916 05:07:05 -- accel/accel.sh@21 -- # val= 00:07:34.916 05:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:34.916 05:07:05 -- accel/accel.sh@21 -- # val= 00:07:34.916 05:07:05 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:34.916 05:07:05 -- accel/accel.sh@21 -- # val= 00:07:34.916 05:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:34.916 05:07:05 -- accel/accel.sh@21 -- # val= 00:07:34.916 05:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:34.916 05:07:05 -- accel/accel.sh@21 -- # val= 00:07:34.916 05:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:34.916 05:07:05 -- accel/accel.sh@21 -- # val= 00:07:34.916 05:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:34.916 05:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:34.916 05:07:05 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:34.916 05:07:05 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:34.916 05:07:05 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:34.916 00:07:34.916 real 0m2.845s 00:07:34.916 user 0m2.531s 00:07:34.916 sys 0m0.320s 00:07:34.916 05:07:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.916 05:07:05 -- common/autotest_common.sh@10 -- # set +x 00:07:34.916 ************************************ 00:07:34.916 END TEST accel_deomp_full_mthread 00:07:34.916 ************************************ 00:07:34.916 05:07:05 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:34.916 05:07:05 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:34.916 05:07:05 -- accel/accel.sh@129 -- # build_accel_config 00:07:34.916 05:07:05 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:34.916 05:07:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:34.916 05:07:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:34.916 05:07:05 -- common/autotest_common.sh@10 -- # set +x 00:07:34.916 05:07:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:34.916 05:07:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:34.916 05:07:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:34.916 05:07:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:34.916 05:07:05 -- accel/accel.sh@41 -- # local IFS=, 00:07:34.916 05:07:05 -- accel/accel.sh@42 -- # jq -r . 00:07:34.916 ************************************ 00:07:34.916 START TEST accel_dif_functional_tests 00:07:34.916 ************************************ 00:07:34.916 05:07:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:34.916 [2024-07-23 05:07:05.693944] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
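The accel_dif_functional_tests run that starts here exercises SPDK's T10 DIF paths: every protected block carries a guard (CRC), an application tag, and a reference tag, and the "DIF not generated" cases deliberately present mismatched fields so that verification must fail. The *ERROR* lines printed below are therefore the expected negative-path output rather than test failures; each such case is followed by "passed" once the corrupted field is detected.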
00:07:34.916 [2024-07-23 05:07:05.694036] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3137655 ] 00:07:34.916 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.916 [2024-07-23 05:07:05.794369] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:34.916 [2024-07-23 05:07:05.879228] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:34.916 [2024-07-23 05:07:05.879321] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:34.916 [2024-07-23 05:07:05.879325] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.916 00:07:34.916 00:07:34.916 CUnit - A unit testing framework for C - Version 2.1-3 00:07:34.916 http://cunit.sourceforge.net/ 00:07:34.916 00:07:34.916 00:07:34.916 Suite: accel_dif 00:07:34.916 Test: verify: DIF generated, GUARD check ...passed 00:07:34.916 Test: verify: DIF generated, APPTAG check ...passed 00:07:34.916 Test: verify: DIF generated, REFTAG check ...passed 00:07:34.916 Test: verify: DIF not generated, GUARD check ...[2024-07-23 05:07:05.951376] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:34.916 [2024-07-23 05:07:05.951431] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:34.916 passed 00:07:34.916 Test: verify: DIF not generated, APPTAG check ...[2024-07-23 05:07:05.951479] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:34.916 [2024-07-23 05:07:05.951504] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:34.916 passed 00:07:34.916 Test: verify: DIF not generated, REFTAG check ...[2024-07-23 05:07:05.951532] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:34.916 [2024-07-23 05:07:05.951555] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:34.916 passed 00:07:34.916 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:34.916 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-23 05:07:05.951614] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:34.916 passed 00:07:34.916 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:34.916 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:34.916 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:34.916 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-23 05:07:05.951744] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:34.916 passed 00:07:34.916 Test: generate copy: DIF generated, GUARD check ...passed 00:07:34.916 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:34.916 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:34.916 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:34.916 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:34.916 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:34.916 Test: generate copy: iovecs-len validate ...[2024-07-23 05:07:05.951964] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:34.916 passed 00:07:34.916 Test: generate copy: buffer alignment validate ...passed 00:07:34.916 00:07:34.916 Run Summary: Type Total Ran Passed Failed Inactive 00:07:34.916 suites 1 1 n/a 0 0 00:07:34.916 tests 20 20 20 0 0 00:07:34.916 asserts 204 204 204 0 n/a 00:07:34.916 00:07:34.916 Elapsed time = 0.002 seconds 00:07:35.175 00:07:35.175 real 0m0.457s 00:07:35.175 user 0m0.648s 00:07:35.175 sys 0m0.181s 00:07:35.175 05:07:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.175 05:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:35.175 ************************************ 00:07:35.175 END TEST accel_dif_functional_tests 00:07:35.175 ************************************ 00:07:35.175 00:07:35.175 real 0m59.668s 00:07:35.175 user 1m6.218s 00:07:35.175 sys 0m8.250s 00:07:35.175 05:07:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.175 05:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:35.175 ************************************ 00:07:35.175 END TEST accel 00:07:35.175 ************************************ 00:07:35.175 05:07:06 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:35.175 05:07:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:35.175 05:07:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:35.175 05:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:35.175 ************************************ 00:07:35.175 START TEST accel_rpc 00:07:35.175 ************************************ 00:07:35.175 05:07:06 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:35.435 * Looking for test storage... 00:07:35.435 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:35.435 05:07:06 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:35.435 05:07:06 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:35.435 05:07:06 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=3137857 00:07:35.435 05:07:06 -- accel/accel_rpc.sh@15 -- # waitforlisten 3137857 00:07:35.435 05:07:06 -- common/autotest_common.sh@819 -- # '[' -z 3137857 ']' 00:07:35.435 05:07:06 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:35.435 05:07:06 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:35.435 05:07:06 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:35.435 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:35.435 05:07:06 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:35.435 05:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:35.435 [2024-07-23 05:07:06.334481] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
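The accel_rpc suite that starts here drives opcode assignment over JSON-RPC while the target is still in its pre-init state. Condensed into direct rpc.py calls as a sketch (the command names are taken from the rpc_cmd wrappers in the trace; sequencing is illustrative):

  ./build/bin/spdk_tgt --wait-for-rpc &                     # assignment only works before init
  ./scripts/rpc.py accel_assign_opc -o copy -m software     # pin the 'copy' opcode to the software module
  ./scripts/rpc.py framework_start_init                     # complete subsystem initialization
  ./scripts/rpc.py accel_get_opc_assignments | jq -r .copy  # prints: software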
00:07:35.435 [2024-07-23 05:07:06.334536] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3137857 ] 00:07:35.435 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.435 [2024-07-23 05:07:06.420853] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.435 [2024-07-23 05:07:06.509603] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:35.435 [2024-07-23 05:07:06.509732] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.694 05:07:06 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:35.694 05:07:06 -- common/autotest_common.sh@852 -- # return 0 00:07:35.694 05:07:06 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:35.694 05:07:06 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:35.694 05:07:06 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:35.694 05:07:06 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:35.694 05:07:06 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:35.694 05:07:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:35.694 05:07:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:35.694 05:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:35.694 ************************************ 00:07:35.694 START TEST accel_assign_opcode 00:07:35.694 ************************************ 00:07:35.694 05:07:06 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:07:35.694 05:07:06 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:35.694 05:07:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:35.694 05:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:35.695 [2024-07-23 05:07:06.562262] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:35.695 05:07:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:35.695 05:07:06 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:35.695 05:07:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:35.695 05:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:35.695 [2024-07-23 05:07:06.570277] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:35.695 05:07:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:35.695 05:07:06 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:35.695 05:07:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:35.695 05:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:35.695 05:07:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:35.695 05:07:06 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:35.695 05:07:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:35.695 05:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:35.695 05:07:06 -- accel/accel_rpc.sh@42 -- # grep software 00:07:35.695 05:07:06 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:35.695 05:07:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:35.954 software 00:07:35.954 00:07:35.954 real 0m0.248s 00:07:35.954 user 0m0.043s 00:07:35.954 sys 0m0.009s 00:07:35.954 05:07:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.954 05:07:06 -- common/autotest_common.sh@10 -- # set +x 
00:07:35.954 ************************************ 00:07:35.954 END TEST accel_assign_opcode 00:07:35.954 ************************************ 00:07:35.954 05:07:06 -- accel/accel_rpc.sh@55 -- # killprocess 3137857 00:07:35.954 05:07:06 -- common/autotest_common.sh@926 -- # '[' -z 3137857 ']' 00:07:35.954 05:07:06 -- common/autotest_common.sh@930 -- # kill -0 3137857 00:07:35.954 05:07:06 -- common/autotest_common.sh@931 -- # uname 00:07:35.954 05:07:06 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:35.954 05:07:06 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3137857 00:07:35.954 05:07:06 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:35.954 05:07:06 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:35.954 05:07:06 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3137857' 00:07:35.954 killing process with pid 3137857 00:07:35.954 05:07:06 -- common/autotest_common.sh@945 -- # kill 3137857 00:07:35.954 05:07:06 -- common/autotest_common.sh@950 -- # wait 3137857 00:07:36.214 00:07:36.214 real 0m1.001s 00:07:36.214 user 0m0.935s 00:07:36.214 sys 0m0.443s 00:07:36.214 05:07:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.214 05:07:07 -- common/autotest_common.sh@10 -- # set +x 00:07:36.214 ************************************ 00:07:36.214 END TEST accel_rpc 00:07:36.214 ************************************ 00:07:36.214 05:07:07 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:36.214 05:07:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:36.214 05:07:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:36.214 05:07:07 -- common/autotest_common.sh@10 -- # set +x 00:07:36.214 ************************************ 00:07:36.214 START TEST app_cmdline 00:07:36.214 ************************************ 00:07:36.214 05:07:07 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:36.474 * Looking for test storage... 00:07:36.474 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:36.474 05:07:07 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:36.474 05:07:07 -- app/cmdline.sh@17 -- # spdk_tgt_pid=3138051 00:07:36.474 05:07:07 -- app/cmdline.sh@18 -- # waitforlisten 3138051 00:07:36.474 05:07:07 -- common/autotest_common.sh@819 -- # '[' -z 3138051 ']' 00:07:36.474 05:07:07 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:36.474 05:07:07 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:36.474 05:07:07 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:36.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:36.474 05:07:07 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:36.474 05:07:07 -- common/autotest_common.sh@10 -- # set +x 00:07:36.474 05:07:07 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:36.474 [2024-07-23 05:07:07.378065] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
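app_cmdline starts the target with an RPC allow-list (--rpcs-allowed spdk_get_version,rpc_get_methods), so any method outside that list must be rejected with JSON-RPC error -32601, which is exactly what the env_dpdk_get_mem_stats probe in the trace below demonstrates. The same check, condensed into a sketch:

  ./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
  ./scripts/rpc.py spdk_get_version         # allowed: returns the version object shown below
  ./scripts/rpc.py env_dpdk_get_mem_stats   # not on the allow-list: fails with "Method not found" (-32601)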
00:07:36.474 [2024-07-23 05:07:07.378129] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3138051 ] 00:07:36.474 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.474 [2024-07-23 05:07:07.473759] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.474 [2024-07-23 05:07:07.559932] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:36.474 [2024-07-23 05:07:07.560062] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.412 05:07:08 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:37.412 05:07:08 -- common/autotest_common.sh@852 -- # return 0 00:07:37.412 05:07:08 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:37.412 { 00:07:37.412 "version": "SPDK v24.01.1-pre git sha1 dbef7efac", 00:07:37.412 "fields": { 00:07:37.412 "major": 24, 00:07:37.412 "minor": 1, 00:07:37.412 "patch": 1, 00:07:37.412 "suffix": "-pre", 00:07:37.412 "commit": "dbef7efac" 00:07:37.412 } 00:07:37.412 } 00:07:37.412 05:07:08 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:37.412 05:07:08 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:37.412 05:07:08 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:37.412 05:07:08 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:37.412 05:07:08 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:37.412 05:07:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:37.412 05:07:08 -- common/autotest_common.sh@10 -- # set +x 00:07:37.412 05:07:08 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:37.412 05:07:08 -- app/cmdline.sh@26 -- # sort 00:07:37.412 05:07:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:37.412 05:07:08 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:37.412 05:07:08 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:37.412 05:07:08 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:37.412 05:07:08 -- common/autotest_common.sh@640 -- # local es=0 00:07:37.412 05:07:08 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:37.412 05:07:08 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:37.412 05:07:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:37.412 05:07:08 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:37.412 05:07:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:37.412 05:07:08 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:37.412 05:07:08 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:37.412 05:07:08 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:37.412 05:07:08 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:37.412 05:07:08 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:37.672 request: 00:07:37.672 { 00:07:37.672 "method": "env_dpdk_get_mem_stats", 00:07:37.672 "req_id": 1 00:07:37.672 } 00:07:37.672 Got JSON-RPC error response 00:07:37.672 response: 00:07:37.672 { 00:07:37.672 "code": -32601, 00:07:37.672 "message": "Method not found" 00:07:37.672 } 00:07:37.672 05:07:08 -- common/autotest_common.sh@643 -- # es=1 00:07:37.672 05:07:08 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:37.672 05:07:08 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:37.672 05:07:08 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:37.672 05:07:08 -- app/cmdline.sh@1 -- # killprocess 3138051 00:07:37.672 05:07:08 -- common/autotest_common.sh@926 -- # '[' -z 3138051 ']' 00:07:37.672 05:07:08 -- common/autotest_common.sh@930 -- # kill -0 3138051 00:07:37.672 05:07:08 -- common/autotest_common.sh@931 -- # uname 00:07:37.672 05:07:08 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:37.672 05:07:08 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3138051 00:07:37.672 05:07:08 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:37.672 05:07:08 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:37.672 05:07:08 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3138051' 00:07:37.672 killing process with pid 3138051 00:07:37.672 05:07:08 -- common/autotest_common.sh@945 -- # kill 3138051 00:07:37.672 05:07:08 -- common/autotest_common.sh@950 -- # wait 3138051 00:07:38.241 00:07:38.241 real 0m1.796s 00:07:38.241 user 0m2.127s 00:07:38.241 sys 0m0.532s 00:07:38.241 05:07:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.241 05:07:09 -- common/autotest_common.sh@10 -- # set +x 00:07:38.241 ************************************ 00:07:38.241 END TEST app_cmdline 00:07:38.241 ************************************ 00:07:38.241 05:07:09 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:38.241 05:07:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:38.241 05:07:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:38.241 05:07:09 -- common/autotest_common.sh@10 -- # set +x 00:07:38.241 ************************************ 00:07:38.241 START TEST version 00:07:38.241 ************************************ 00:07:38.241 05:07:09 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:38.241 * Looking for test storage... 
00:07:38.241 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:38.241 05:07:09 -- app/version.sh@17 -- # get_header_version major 00:07:38.241 05:07:09 -- app/version.sh@14 -- # cut -f2 00:07:38.241 05:07:09 -- app/version.sh@14 -- # tr -d '"' 00:07:38.241 05:07:09 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:38.241 05:07:09 -- app/version.sh@17 -- # major=24 00:07:38.241 05:07:09 -- app/version.sh@18 -- # get_header_version minor 00:07:38.241 05:07:09 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:38.241 05:07:09 -- app/version.sh@14 -- # cut -f2 00:07:38.241 05:07:09 -- app/version.sh@14 -- # tr -d '"' 00:07:38.241 05:07:09 -- app/version.sh@18 -- # minor=1 00:07:38.241 05:07:09 -- app/version.sh@19 -- # get_header_version patch 00:07:38.241 05:07:09 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:38.241 05:07:09 -- app/version.sh@14 -- # cut -f2 00:07:38.241 05:07:09 -- app/version.sh@14 -- # tr -d '"' 00:07:38.241 05:07:09 -- app/version.sh@19 -- # patch=1 00:07:38.241 05:07:09 -- app/version.sh@20 -- # get_header_version suffix 00:07:38.241 05:07:09 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:38.241 05:07:09 -- app/version.sh@14 -- # cut -f2 00:07:38.241 05:07:09 -- app/version.sh@14 -- # tr -d '"' 00:07:38.241 05:07:09 -- app/version.sh@20 -- # suffix=-pre 00:07:38.241 05:07:09 -- app/version.sh@22 -- # version=24.1 00:07:38.241 05:07:09 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:38.241 05:07:09 -- app/version.sh@25 -- # version=24.1.1 00:07:38.241 05:07:09 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:38.241 05:07:09 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:38.241 05:07:09 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:38.241 05:07:09 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:38.241 05:07:09 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:38.241 00:07:38.241 real 0m0.186s 00:07:38.241 user 0m0.083s 00:07:38.241 sys 0m0.141s 00:07:38.241 05:07:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.241 05:07:09 -- common/autotest_common.sh@10 -- # set +x 00:07:38.241 ************************************ 00:07:38.241 END TEST version 00:07:38.241 ************************************ 00:07:38.501 05:07:09 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@204 -- # uname -s 00:07:38.501 05:07:09 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:07:38.501 05:07:09 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:38.501 05:07:09 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:38.501 05:07:09 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@268 -- # timing_exit lib 
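The version test above derives 24.1.1rc0 by scraping include/spdk/version.h, and its grep/cut/tr pipeline can be condensed into the helper below (assuming, as the bare cut -f2 in the trace suggests, that the #define lines are tab-separated):

  get_header_version() {  # e.g. get_header_version MAJOR -> 24
      grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" include/spdk/version.h |
          cut -f2 | tr -d '"'
  }
  version="$(get_header_version MAJOR).$(get_header_version MINOR)"   # -> 24.1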
00:07:38.501 05:07:09 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:38.501 05:07:09 -- common/autotest_common.sh@10 -- # set +x 00:07:38.501 05:07:09 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:38.501 05:07:09 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:38.501 05:07:09 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:38.501 05:07:09 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:38.501 05:07:09 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:38.501 05:07:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:38.501 05:07:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:38.501 05:07:09 -- common/autotest_common.sh@10 -- # set +x 00:07:38.501 ************************************ 00:07:38.501 START TEST llvm_fuzz 00:07:38.501 ************************************ 00:07:38.501 05:07:09 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:38.501 * Looking for test storage... 
00:07:38.501 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:38.501 05:07:09 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:38.501 05:07:09 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:38.501 05:07:09 -- common/autotest_common.sh@538 -- # fuzzers=() 00:07:38.501 05:07:09 -- common/autotest_common.sh@538 -- # local fuzzers 00:07:38.501 05:07:09 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:07:38.501 05:07:09 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:38.501 05:07:09 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:38.501 05:07:09 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:38.501 05:07:09 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:38.501 05:07:09 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:38.501 05:07:09 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:38.501 05:07:09 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:38.501 05:07:09 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:38.501 05:07:09 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:38.501 05:07:09 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:38.501 05:07:09 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:38.501 05:07:09 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:38.501 05:07:09 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:38.501 05:07:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:38.501 05:07:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:38.501 05:07:09 -- common/autotest_common.sh@10 -- # set +x 00:07:38.501 ************************************ 00:07:38.501 START TEST nvmf_fuzz 00:07:38.501 ************************************ 00:07:38.501 05:07:09 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:38.763 * Looking for test storage... 
00:07:38.763 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:38.763 05:07:09 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:38.763 05:07:09 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:38.763 05:07:09 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:38.763 05:07:09 -- common/autotest_common.sh@34 -- # set -e 00:07:38.763 05:07:09 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:38.763 05:07:09 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:38.763 05:07:09 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:38.763 05:07:09 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:38.763 05:07:09 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:38.763 05:07:09 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:38.763 05:07:09 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:38.763 05:07:09 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:38.763 05:07:09 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:38.763 05:07:09 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:38.763 05:07:09 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:38.763 05:07:09 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:38.763 05:07:09 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:38.763 05:07:09 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:38.763 05:07:09 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:38.763 05:07:09 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:38.763 05:07:09 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:38.763 05:07:09 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:38.763 05:07:09 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:38.763 05:07:09 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:38.763 05:07:09 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:38.763 05:07:09 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:38.763 05:07:09 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:38.763 05:07:09 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:38.763 05:07:09 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:38.763 05:07:09 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:38.763 05:07:09 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:38.763 05:07:09 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:38.763 05:07:09 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:38.763 05:07:09 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:38.763 05:07:09 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:38.763 05:07:09 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:38.763 05:07:09 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:38.763 05:07:09 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:38.763 05:07:09 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:38.763 05:07:09 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:38.763 05:07:09 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:38.763 05:07:09 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:38.763 05:07:09 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:38.763 05:07:09 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:38.763 05:07:09 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:38.763 05:07:09 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:38.763 05:07:09 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:38.763 05:07:09 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:38.763 05:07:09 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:38.763 05:07:09 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:38.763 05:07:09 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:38.763 05:07:09 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:38.763 05:07:09 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:38.763 05:07:09 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:38.763 05:07:09 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:38.763 05:07:09 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:38.763 05:07:09 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:38.763 05:07:09 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:38.763 05:07:09 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:38.763 05:07:09 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:38.763 05:07:09 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:38.763 05:07:09 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:38.763 05:07:09 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:38.763 05:07:09 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:38.763 05:07:09 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:38.763 05:07:09 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:38.763 05:07:09 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:38.763 05:07:09 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:38.763 05:07:09 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:07:38.763 05:07:09 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:38.763 05:07:09 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:38.763 05:07:09 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:38.763 05:07:09 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:38.763 05:07:09 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:38.763 05:07:09 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:38.763 05:07:09 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:38.763 05:07:09 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:38.763 05:07:09 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:38.763 05:07:09 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:38.763 05:07:09 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:38.763 05:07:09 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:38.763 05:07:09 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:38.763 05:07:09 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:38.763 05:07:09 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:38.763 05:07:09 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:38.763 05:07:09 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:38.763 05:07:09 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:38.763 05:07:09 -- common/autotest_common.sh@48 -- # source 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:38.763 05:07:09 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:38.763 05:07:09 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:38.763 05:07:09 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:38.763 05:07:09 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:38.763 05:07:09 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:38.763 05:07:09 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:38.763 05:07:09 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:38.763 05:07:09 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:38.763 05:07:09 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:38.763 05:07:09 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:38.763 05:07:09 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:38.763 05:07:09 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:38.763 05:07:09 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:38.763 05:07:09 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:38.763 05:07:09 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:38.763 #define SPDK_CONFIG_H 00:07:38.763 #define SPDK_CONFIG_APPS 1 00:07:38.763 #define SPDK_CONFIG_ARCH native 00:07:38.763 #undef SPDK_CONFIG_ASAN 00:07:38.763 #undef SPDK_CONFIG_AVAHI 00:07:38.763 #undef SPDK_CONFIG_CET 00:07:38.763 #define SPDK_CONFIG_COVERAGE 1 00:07:38.764 #define SPDK_CONFIG_CROSS_PREFIX 00:07:38.764 #undef SPDK_CONFIG_CRYPTO 00:07:38.764 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:38.764 #undef SPDK_CONFIG_CUSTOMOCF 00:07:38.764 #undef SPDK_CONFIG_DAOS 00:07:38.764 #define SPDK_CONFIG_DAOS_DIR 00:07:38.764 #define SPDK_CONFIG_DEBUG 1 00:07:38.764 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:38.764 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:38.764 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:38.764 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:38.764 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:38.764 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:38.764 #define SPDK_CONFIG_EXAMPLES 1 00:07:38.764 #undef SPDK_CONFIG_FC 00:07:38.764 #define SPDK_CONFIG_FC_PATH 00:07:38.764 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:38.764 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:38.764 #undef SPDK_CONFIG_FUSE 00:07:38.764 #define SPDK_CONFIG_FUZZER 1 00:07:38.764 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:38.764 #undef SPDK_CONFIG_GOLANG 00:07:38.764 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:38.764 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:38.764 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:38.764 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:38.764 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:38.764 #define SPDK_CONFIG_IDXD 1 00:07:38.764 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:38.764 #undef SPDK_CONFIG_IPSEC_MB 
00:07:38.764 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:38.764 #define SPDK_CONFIG_ISAL 1 00:07:38.764 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:38.764 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:38.764 #define SPDK_CONFIG_LIBDIR 00:07:38.764 #undef SPDK_CONFIG_LTO 00:07:38.764 #define SPDK_CONFIG_MAX_LCORES 00:07:38.764 #define SPDK_CONFIG_NVME_CUSE 1 00:07:38.764 #undef SPDK_CONFIG_OCF 00:07:38.764 #define SPDK_CONFIG_OCF_PATH 00:07:38.764 #define SPDK_CONFIG_OPENSSL_PATH 00:07:38.764 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:38.764 #undef SPDK_CONFIG_PGO_USE 00:07:38.764 #define SPDK_CONFIG_PREFIX /usr/local 00:07:38.764 #undef SPDK_CONFIG_RAID5F 00:07:38.764 #undef SPDK_CONFIG_RBD 00:07:38.764 #define SPDK_CONFIG_RDMA 1 00:07:38.764 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:38.764 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:38.764 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:38.764 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:38.764 #undef SPDK_CONFIG_SHARED 00:07:38.764 #undef SPDK_CONFIG_SMA 00:07:38.764 #define SPDK_CONFIG_TESTS 1 00:07:38.764 #undef SPDK_CONFIG_TSAN 00:07:38.764 #define SPDK_CONFIG_UBLK 1 00:07:38.764 #define SPDK_CONFIG_UBSAN 1 00:07:38.764 #undef SPDK_CONFIG_UNIT_TESTS 00:07:38.764 #undef SPDK_CONFIG_URING 00:07:38.764 #define SPDK_CONFIG_URING_PATH 00:07:38.764 #undef SPDK_CONFIG_URING_ZNS 00:07:38.764 #undef SPDK_CONFIG_USDT 00:07:38.764 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:38.764 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:38.764 #define SPDK_CONFIG_VFIO_USER 1 00:07:38.764 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:38.764 #define SPDK_CONFIG_VHOST 1 00:07:38.764 #define SPDK_CONFIG_VIRTIO 1 00:07:38.764 #undef SPDK_CONFIG_VTUNE 00:07:38.764 #define SPDK_CONFIG_VTUNE_DIR 00:07:38.764 #define SPDK_CONFIG_WERROR 1 00:07:38.764 #define SPDK_CONFIG_WPDK_DIR 00:07:38.764 #undef SPDK_CONFIG_XNVME 00:07:38.764 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:38.764 05:07:09 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:38.764 05:07:09 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:38.764 05:07:09 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:38.764 05:07:09 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:38.764 05:07:09 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:38.764 05:07:09 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:38.764 05:07:09 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:38.764 05:07:09 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:38.764 05:07:09 -- paths/export.sh@5 -- # export PATH 00:07:38.764 05:07:09 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:38.764 05:07:09 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:38.764 05:07:09 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:38.764 05:07:09 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:38.764 05:07:09 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:38.764 05:07:09 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:38.764 05:07:09 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:38.764 05:07:09 -- pm/common@16 -- # TEST_TAG=N/A 00:07:38.764 05:07:09 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:38.764 05:07:09 -- common/autotest_common.sh@52 -- # : 1 00:07:38.764 05:07:09 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:38.764 05:07:09 -- common/autotest_common.sh@56 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:38.764 05:07:09 -- common/autotest_common.sh@58 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:38.764 05:07:09 -- common/autotest_common.sh@60 -- # : 1 00:07:38.764 05:07:09 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:38.764 05:07:09 -- common/autotest_common.sh@62 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:38.764 05:07:09 -- common/autotest_common.sh@64 -- # : 00:07:38.764 05:07:09 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:38.764 05:07:09 -- common/autotest_common.sh@66 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:38.764 05:07:09 -- common/autotest_common.sh@68 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:38.764 05:07:09 -- common/autotest_common.sh@70 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:38.764 05:07:09 -- common/autotest_common.sh@72 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:38.764 05:07:09 -- common/autotest_common.sh@74 -- # : 0 00:07:38.764 
05:07:09 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:38.764 05:07:09 -- common/autotest_common.sh@76 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:38.764 05:07:09 -- common/autotest_common.sh@78 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:38.764 05:07:09 -- common/autotest_common.sh@80 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:38.764 05:07:09 -- common/autotest_common.sh@82 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:38.764 05:07:09 -- common/autotest_common.sh@84 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:38.764 05:07:09 -- common/autotest_common.sh@86 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:38.764 05:07:09 -- common/autotest_common.sh@88 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:38.764 05:07:09 -- common/autotest_common.sh@90 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:38.764 05:07:09 -- common/autotest_common.sh@92 -- # : 1 00:07:38.764 05:07:09 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:38.764 05:07:09 -- common/autotest_common.sh@94 -- # : 1 00:07:38.764 05:07:09 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:38.764 05:07:09 -- common/autotest_common.sh@96 -- # : rdma 00:07:38.764 05:07:09 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:38.764 05:07:09 -- common/autotest_common.sh@98 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:38.764 05:07:09 -- common/autotest_common.sh@100 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:38.764 05:07:09 -- common/autotest_common.sh@102 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:38.764 05:07:09 -- common/autotest_common.sh@104 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:38.764 05:07:09 -- common/autotest_common.sh@106 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:38.764 05:07:09 -- common/autotest_common.sh@108 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:38.764 05:07:09 -- common/autotest_common.sh@110 -- # : 0 00:07:38.764 05:07:09 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:38.764 05:07:09 -- common/autotest_common.sh@112 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:38.765 05:07:09 -- common/autotest_common.sh@114 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:38.765 05:07:09 -- common/autotest_common.sh@116 -- # : 1 00:07:38.765 05:07:09 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:38.765 05:07:09 -- common/autotest_common.sh@118 -- # : 00:07:38.765 05:07:09 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:38.765 05:07:09 -- common/autotest_common.sh@120 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:38.765 05:07:09 -- common/autotest_common.sh@122 -- # : 0 
00:07:38.765 05:07:09 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:38.765 05:07:09 -- common/autotest_common.sh@124 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:38.765 05:07:09 -- common/autotest_common.sh@126 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:38.765 05:07:09 -- common/autotest_common.sh@128 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:38.765 05:07:09 -- common/autotest_common.sh@130 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:38.765 05:07:09 -- common/autotest_common.sh@132 -- # : 00:07:38.765 05:07:09 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:38.765 05:07:09 -- common/autotest_common.sh@134 -- # : true 00:07:38.765 05:07:09 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:38.765 05:07:09 -- common/autotest_common.sh@136 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:38.765 05:07:09 -- common/autotest_common.sh@138 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:38.765 05:07:09 -- common/autotest_common.sh@140 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:38.765 05:07:09 -- common/autotest_common.sh@142 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:38.765 05:07:09 -- common/autotest_common.sh@144 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:38.765 05:07:09 -- common/autotest_common.sh@146 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:38.765 05:07:09 -- common/autotest_common.sh@148 -- # : 00:07:38.765 05:07:09 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:38.765 05:07:09 -- common/autotest_common.sh@150 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:38.765 05:07:09 -- common/autotest_common.sh@152 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:38.765 05:07:09 -- common/autotest_common.sh@154 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:38.765 05:07:09 -- common/autotest_common.sh@156 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:38.765 05:07:09 -- common/autotest_common.sh@158 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:38.765 05:07:09 -- common/autotest_common.sh@160 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:38.765 05:07:09 -- common/autotest_common.sh@163 -- # : 00:07:38.765 05:07:09 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:38.765 05:07:09 -- common/autotest_common.sh@165 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:38.765 05:07:09 -- common/autotest_common.sh@167 -- # : 0 00:07:38.765 05:07:09 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:38.765 05:07:09 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:38.765 05:07:09 -- 
common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:38.765 05:07:09 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:38.765 05:07:09 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:38.765 05:07:09 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:38.765 05:07:09 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:38.765 05:07:09 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:38.765 05:07:09 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:38.765 05:07:09 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:38.765 05:07:09 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:38.765 05:07:09 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:38.765 05:07:09 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:38.765 05:07:09 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:38.765 05:07:09 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:38.765 05:07:09 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:38.765 05:07:09 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:38.765 05:07:09 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:38.765 05:07:09 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:38.765 05:07:09 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:38.765 05:07:09 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:38.765 05:07:09 -- common/autotest_common.sh@196 -- # cat 00:07:38.765 05:07:09 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:38.765 05:07:09 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:38.765 05:07:09 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:38.765 05:07:09 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:38.765 05:07:09 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:38.765 05:07:09 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:38.765 05:07:09 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:38.765 05:07:09 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:38.765 05:07:09 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:38.765 05:07:09 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:38.765 05:07:09 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:38.765 05:07:09 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:38.765 05:07:09 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:38.765 05:07:09 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:38.765 05:07:09 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:38.765 05:07:09 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:38.765 05:07:09 -- common/autotest_common.sh@242 -- # 
AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:38.765 05:07:09 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:38.765 05:07:09 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:38.765 05:07:09 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:07:38.765 05:07:09 -- common/autotest_common.sh@249 -- # export valgrind= 00:07:38.765 05:07:09 -- common/autotest_common.sh@249 -- # valgrind= 00:07:38.765 05:07:09 -- common/autotest_common.sh@255 -- # uname -s 00:07:38.765 05:07:09 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:07:38.765 05:07:09 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:07:38.765 05:07:09 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:07:38.766 05:07:09 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:07:38.766 05:07:09 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@265 -- # MAKE=make 00:07:38.766 05:07:09 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:07:38.766 05:07:09 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:07:38.766 05:07:09 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:07:38.766 05:07:09 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:38.766 05:07:09 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:07:38.766 05:07:09 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:07:38.766 05:07:09 -- common/autotest_common.sh@309 -- # [[ -z 3138717 ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@309 -- # kill -0 3138717 00:07:38.766 05:07:09 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:07:38.766 05:07:09 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:07:38.766 05:07:09 -- common/autotest_common.sh@322 -- # local mount target_dir 00:07:38.766 05:07:09 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:07:38.766 05:07:09 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:07:38.766 05:07:09 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:07:38.766 05:07:09 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:07:38.766 05:07:09 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.KMQvdU 00:07:38.766 05:07:09 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:38.766 05:07:09 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.KMQvdU/tests/nvmf /tmp/spdk.KMQvdU 00:07:38.766 05:07:09 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:07:38.766 05:07:09 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:38.766 05:07:09 -- common/autotest_common.sh@318 -- # df -T 00:07:38.766 05:07:09 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 
00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:07:38.766 05:07:09 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:07:38.766 05:07:09 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:07:38.766 05:07:09 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:07:38.766 05:07:09 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # avails["$mount"]=49165746176 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:07:38.766 05:07:09 -- common/autotest_common.sh@354 -- # uses["$mount"]=12576571392 00:07:38.766 05:07:09 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:38.766 05:07:09 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:07:38.766 05:07:09 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342489088 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:07:38.766 05:07:09 -- common/autotest_common.sh@354 -- # uses["$mount"]=5976064 00:07:38.766 05:07:09 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868627456 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:38.766 05:07:09 -- common/autotest_common.sh@354 -- # uses["$mount"]=2531328 00:07:38.766 05:07:09 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:38.766 05:07:09 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:07:38.766 05:07:09 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:07:38.766 05:07:09 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:07:38.766 05:07:09 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:38.766 05:07:09 -- 
common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:07:38.766 * Looking for test storage... 00:07:38.766 05:07:09 -- common/autotest_common.sh@359 -- # local target_space new_size 00:07:38.766 05:07:09 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:07:38.766 05:07:09 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:38.766 05:07:09 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:38.766 05:07:09 -- common/autotest_common.sh@363 -- # mount=/ 00:07:38.766 05:07:09 -- common/autotest_common.sh@365 -- # target_space=49165746176 00:07:38.766 05:07:09 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:07:38.766 05:07:09 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:07:38.766 05:07:09 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@372 -- # new_size=14791163904 00:07:38.766 05:07:09 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:38.766 05:07:09 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:38.766 05:07:09 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:38.766 05:07:09 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:38.766 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:38.766 05:07:09 -- common/autotest_common.sh@380 -- # return 0 00:07:38.766 05:07:09 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:07:38.766 05:07:09 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:07:38.766 05:07:09 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:38.766 05:07:09 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:38.766 05:07:09 -- common/autotest_common.sh@1672 -- # true 00:07:38.766 05:07:09 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:07:38.766 05:07:09 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:38.766 05:07:09 -- common/autotest_common.sh@27 -- # exec 00:07:38.766 05:07:09 -- common/autotest_common.sh@29 -- # exec 00:07:38.766 05:07:09 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:38.766 05:07:09 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:38.766 05:07:09 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:38.766 05:07:09 -- common/autotest_common.sh@18 -- # set -x 00:07:38.766 05:07:09 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:38.766 05:07:09 -- ../common.sh@8 -- # pids=() 00:07:38.766 05:07:09 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:38.766 05:07:09 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:38.766 05:07:09 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:38.766 05:07:09 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:38.766 05:07:09 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:38.766 05:07:09 -- nvmf/run.sh@61 -- # mem_size=512 00:07:38.766 05:07:09 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:38.766 05:07:09 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:38.766 05:07:09 -- ../common.sh@69 -- # local fuzz_num=25 00:07:38.766 05:07:09 -- ../common.sh@70 -- # local time=1 00:07:38.766 05:07:09 -- ../common.sh@72 -- # (( i = 0 )) 00:07:38.766 05:07:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:38.766 05:07:09 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:38.766 05:07:09 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:38.766 05:07:09 -- nvmf/run.sh@24 -- # local timen=1 00:07:38.766 05:07:09 -- nvmf/run.sh@25 -- # local core=0x1 00:07:38.766 05:07:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:38.766 05:07:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:38.766 05:07:09 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:38.766 05:07:09 -- nvmf/run.sh@29 -- # port=4400 00:07:38.766 05:07:09 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:38.766 05:07:09 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:38.766 05:07:09 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:38.766 05:07:09 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:39.026 [2024-07-23 05:07:09.856071] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:39.026 [2024-07-23 05:07:09.856166] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3138767 ] 00:07:39.026 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.026 [2024-07-23 05:07:10.083044] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.285 [2024-07-23 05:07:10.165990] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:39.285 [2024-07-23 05:07:10.166164] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.285 [2024-07-23 05:07:10.227154] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:39.285 [2024-07-23 05:07:10.243520] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:39.285 INFO: Running with entropic power schedule (0xFF, 100). 00:07:39.285 INFO: Seed: 2351796637 00:07:39.285 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:39.285 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:39.285 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:39.285 INFO: A corpus is not provided, starting from an empty corpus 00:07:39.285 #2 INITED exec/s: 0 rss: 60Mb 00:07:39.285 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:39.285 This may also happen if the target rejected all inputs we tried so far 00:07:39.285 [2024-07-23 05:07:10.298906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:39.285 [2024-07-23 05:07:10.298942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.853 NEW_FUNC[1/670]: 0x480d10 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:39.853 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:39.853 #9 NEW cov: 11472 ft: 11473 corp: 2/122b lim: 320 exec/s: 0 rss: 67Mb L: 121/121 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:39.853 [2024-07-23 05:07:10.729973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:39.853 [2024-07-23 05:07:10.730015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.853 #10 NEW cov: 11585 ft: 12001 corp: 3/243b lim: 320 exec/s: 0 rss: 67Mb L: 121/121 MS: 1 ChangeByte- 00:07:39.853 [2024-07-23 05:07:10.790062] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:39.853 [2024-07-23 05:07:10.790098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.853 #11 NEW cov: 11591 ft: 12283 corp: 4/364b lim: 320 exec/s: 0 rss: 67Mb L: 121/121 MS: 1 ChangeBit- 00:07:39.854 [2024-07-23 05:07:10.840308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:39.854 [2024-07-23 05:07:10.840341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.854 [2024-07-23 05:07:10.840409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565613565656 00:07:39.854 [2024-07-23 05:07:10.840428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.854 #12 NEW cov: 11698 ft: 12779 corp: 5/527b lim: 320 exec/s: 0 rss: 67Mb L: 163/163 MS: 1 CrossOver- 00:07:39.854 [2024-07-23 05:07:10.900461] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:39.854 [2024-07-23 05:07:10.900494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.854 [2024-07-23 05:07:10.900566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565613565656 00:07:39.854 [2024-07-23 05:07:10.900585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.854 #13 NEW cov: 11698 ft: 12834 corp: 6/690b lim: 320 exec/s: 0 rss: 67Mb L: 163/163 MS: 1 ShuffleBytes- 00:07:40.112 [2024-07-23 05:07:10.950651] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.112 [2024-07-23 05:07:10.950684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.112 [2024-07-23 05:07:10.950754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656561356565656 00:07:40.112 [2024-07-23 05:07:10.950778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.112 #14 NEW cov: 11698 ft: 12924 corp: 7/854b lim: 320 exec/s: 0 rss: 68Mb L: 164/164 MS: 1 InsertByte- 00:07:40.112 [2024-07-23 05:07:10.990731] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.112 [2024-07-23 05:07:10.990764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.112 [2024-07-23 05:07:10.990831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656561356565656 00:07:40.112 [2024-07-23 05:07:10.990849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.113 #15 NEW cov: 11698 ft: 12960 corp: 8/1018b lim: 320 exec/s: 0 rss: 68Mb L: 164/164 MS: 1 InsertByte- 00:07:40.113 [2024-07-23 05:07:11.030751] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.113 
[2024-07-23 05:07:11.030784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.113 #16 NEW cov: 11698 ft: 13007 corp: 9/1139b lim: 320 exec/s: 0 rss: 68Mb L: 121/164 MS: 1 CopyPart- 00:07:40.113 [2024-07-23 05:07:11.070756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:40.113 [2024-07-23 05:07:11.070788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.113 NEW_FUNC[1/1]: 0x12e0670 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014 00:07:40.113 #20 NEW cov: 11729 ft: 13097 corp: 10/1253b lim: 320 exec/s: 0 rss: 68Mb L: 114/164 MS: 4 ChangeByte-InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:40.113 [2024-07-23 05:07:11.120952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:40.113 [2024-07-23 05:07:11.120984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.113 #21 NEW cov: 11729 ft: 13133 corp: 11/1367b lim: 320 exec/s: 0 rss: 68Mb L: 114/164 MS: 1 ChangeBit- 00:07:40.113 [2024-07-23 05:07:11.171355] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.113 [2024-07-23 05:07:11.171389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.113 [2024-07-23 05:07:11.171464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565613565656 00:07:40.113 [2024-07-23 05:07:11.171484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.113 [2024-07-23 05:07:11.171555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:6 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:40.113 [2024-07-23 05:07:11.171573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.113 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:40.113 #22 NEW cov: 11752 ft: 13362 corp: 12/1590b lim: 320 exec/s: 0 rss: 68Mb L: 223/223 MS: 1 InsertRepeatedBytes- 00:07:40.489 [2024-07-23 05:07:11.211233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:40.489 [2024-07-23 05:07:11.211271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.489 #23 NEW cov: 11752 ft: 13400 corp: 13/1704b lim: 320 exec/s: 0 rss: 68Mb L: 114/223 MS: 1 ChangeASCIIInt- 00:07:40.489 [2024-07-23 05:07:11.261484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x5656565656565656 00:07:40.489 [2024-07-23 05:07:11.261517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.489 [2024-07-23 05:07:11.261588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565613565656 00:07:40.489 [2024-07-23 05:07:11.261607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.489 #34 NEW cov: 11752 ft: 13405 corp: 14/1867b lim: 320 exec/s: 34 rss: 68Mb L: 163/223 MS: 1 CMP- DE: "y\012"- 00:07:40.489 [2024-07-23 05:07:11.321529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.489 [2024-07-23 05:07:11.321562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.489 #35 NEW cov: 11752 ft: 13475 corp: 15/1979b lim: 320 exec/s: 35 rss: 68Mb L: 112/223 MS: 1 EraseBytes- 00:07:40.489 [2024-07-23 05:07:11.361636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:4 nsid:56545656 cdw10:56565656 cdw11:56561356 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.489 [2024-07-23 05:07:11.361668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.489 #36 NEW cov: 11752 ft: 13525 corp: 16/2046b lim: 320 exec/s: 36 rss: 68Mb L: 67/223 MS: 1 CrossOver- 00:07:40.489 [2024-07-23 05:07:11.401902] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.489 [2024-07-23 05:07:11.401934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.489 [2024-07-23 05:07:11.402002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565654 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.489 [2024-07-23 05:07:11.402021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.489 #37 NEW cov: 11752 ft: 13563 corp: 17/2202b lim: 320 exec/s: 37 rss: 68Mb L: 156/223 MS: 1 CrossOver- 00:07:40.489 [2024-07-23 05:07:11.441893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.489 [2024-07-23 05:07:11.441926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.489 #38 NEW cov: 11752 ft: 13616 corp: 18/2323b lim: 320 exec/s: 38 rss: 68Mb L: 121/223 MS: 1 ChangeByte- 00:07:40.489 [2024-07-23 05:07:11.491950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:40.489 [2024-07-23 05:07:11.491982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.489 #39 NEW cov: 11752 ft: 13642 corp: 19/2438b lim: 320 exec/s: 39 rss: 68Mb L: 115/223 MS: 1 InsertByte- 
00:07:40.489 [2024-07-23 05:07:11.532272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.489 [2024-07-23 05:07:11.532307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.490 [2024-07-23 05:07:11.532379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:560a5656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.490 [2024-07-23 05:07:11.532398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.490 #40 NEW cov: 11752 ft: 13646 corp: 20/2594b lim: 320 exec/s: 40 rss: 68Mb L: 156/223 MS: 1 CopyPart- 00:07:40.748 [2024-07-23 05:07:11.582320] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.748 [2024-07-23 05:07:11.582353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.748 #41 NEW cov: 11752 ft: 13662 corp: 21/2715b lim: 320 exec/s: 41 rss: 68Mb L: 121/223 MS: 1 CopyPart- 00:07:40.749 [2024-07-23 05:07:11.622427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x56565656565656d6 00:07:40.749 [2024-07-23 05:07:11.622463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.749 #42 NEW cov: 11752 ft: 13670 corp: 22/2827b lim: 320 exec/s: 42 rss: 68Mb L: 112/223 MS: 1 ChangeBit- 00:07:40.749 [2024-07-23 05:07:11.672497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:40.749 [2024-07-23 05:07:11.672529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.749 #43 NEW cov: 11752 ft: 13683 corp: 23/2952b lim: 320 exec/s: 43 rss: 69Mb L: 125/223 MS: 1 CrossOver- 00:07:40.749 [2024-07-23 05:07:11.722682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:40.749 [2024-07-23 05:07:11.722717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.749 #44 NEW cov: 11752 ft: 13691 corp: 24/3067b lim: 320 exec/s: 44 rss: 69Mb L: 115/223 MS: 1 ChangeBinInt- 00:07:40.749 [2024-07-23 05:07:11.762915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.749 [2024-07-23 05:07:11.762947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.749 [2024-07-23 05:07:11.763014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.749 [2024-07-23 05:07:11.763033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.749 #45 NEW cov: 11752 ft: 13700 corp: 25/3225b lim: 320 exec/s: 45 rss: 69Mb L: 158/223 MS: 1 EraseBytes- 00:07:40.749 [2024-07-23 05:07:11.813107] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.749 [2024-07-23 05:07:11.813138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.749 [2024-07-23 05:07:11.813208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565654 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:40.749 [2024-07-23 05:07:11.813227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.071 #46 NEW cov: 11752 ft: 13712 corp: 26/3381b lim: 320 exec/s: 46 rss: 69Mb L: 156/223 MS: 1 CopyPart- 00:07:41.071 [2024-07-23 05:07:11.853257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:41.071 [2024-07-23 05:07:11.853292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.071 [2024-07-23 05:07:11.853363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:41.071 [2024-07-23 05:07:11.853381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.072 [2024-07-23 05:07:11.853457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:41.072 [2024-07-23 05:07:11.853478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.072 #47 NEW cov: 11752 ft: 13773 corp: 27/3579b lim: 320 exec/s: 47 rss: 69Mb L: 198/223 MS: 1 CrossOver- 00:07:41.072 [2024-07-23 05:07:11.903194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffbffffffffff 00:07:41.072 [2024-07-23 05:07:11.903225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.072 #48 NEW cov: 11752 ft: 13870 corp: 28/3693b lim: 320 exec/s: 48 rss: 69Mb L: 114/223 MS: 1 CopyPart- 00:07:41.072 [2024-07-23 05:07:11.943495] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:41.072 [2024-07-23 05:07:11.943526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.072 [2024-07-23 05:07:11.943599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565613565656 00:07:41.072 [2024-07-23 05:07:11.943618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.072 #49 NEW cov: 11752 ft: 13882 corp: 29/3856b lim: 320 exec/s: 49 rss: 69Mb L: 163/223 MS: 1 ChangeBinInt- 00:07:41.072 [2024-07-23 05:07:11.983403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffbffffffffff 00:07:41.072 [2024-07-23 05:07:11.983436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.072 #50 NEW cov: 11752 ft: 13889 corp: 30/3970b lim: 320 exec/s: 50 rss: 69Mb L: 114/223 MS: 1 ChangeBit- 00:07:41.072 [2024-07-23 05:07:12.033619] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:41.072 [2024-07-23 05:07:12.033650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.072 #51 NEW cov: 11752 ft: 13900 corp: 31/4091b lim: 320 exec/s: 51 rss: 69Mb L: 121/223 MS: 1 ChangeBit- 00:07:41.072 [2024-07-23 05:07:12.073712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:4 nsid:56545656 cdw10:56565656 cdw11:56561356 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:41.072 [2024-07-23 05:07:12.073744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.072 #52 NEW cov: 11752 ft: 13953 corp: 32/4158b lim: 320 exec/s: 52 rss: 69Mb L: 67/223 MS: 1 ChangeBit- 00:07:41.072 [2024-07-23 05:07:12.123982] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:41.072 [2024-07-23 05:07:12.124014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.072 [2024-07-23 05:07:12.124087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565654 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:41.072 [2024-07-23 05:07:12.124106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.331 #53 NEW cov: 11752 ft: 13965 corp: 33/4314b lim: 320 exec/s: 53 rss: 69Mb L: 156/223 MS: 1 ChangeByte- 00:07:41.331 [2024-07-23 05:07:12.174194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:41.331 [2024-07-23 05:07:12.174226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.332 [2024-07-23 05:07:12.174297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:41.332 [2024-07-23 05:07:12.174316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.332 [2024-07-23 05:07:12.174384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 
0xffffffffffffffff 00:07:41.332 [2024-07-23 05:07:12.174403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.332 #54 NEW cov: 11752 ft: 13970 corp: 34/4512b lim: 320 exec/s: 54 rss: 69Mb L: 198/223 MS: 1 ChangeASCIIInt- 00:07:41.332 [2024-07-23 05:07:12.224268] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656565656565656 00:07:41.332 [2024-07-23 05:07:12.224301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.332 [2024-07-23 05:07:12.224378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (56) qid:0 cid:5 nsid:56565656 cdw10:56565656 cdw11:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5656561356565656 00:07:41.332 [2024-07-23 05:07:12.224396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.332 #55 NEW cov: 11752 ft: 13979 corp: 35/4676b lim: 320 exec/s: 55 rss: 69Mb L: 164/223 MS: 1 ChangeBinInt- 00:07:41.332 [2024-07-23 05:07:12.274262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:56565656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x56565656565656d6 00:07:41.332 [2024-07-23 05:07:12.274294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.332 #56 NEW cov: 11752 ft: 13986 corp: 36/4788b lim: 320 exec/s: 28 rss: 70Mb L: 112/223 MS: 1 ChangeBit- 00:07:41.332 #56 DONE cov: 11752 ft: 13986 corp: 36/4788b lim: 320 exec/s: 28 rss: 70Mb 00:07:41.332 ###### Recommended dictionary. ###### 00:07:41.332 "y\012" # Uses: 0 00:07:41.332 ###### End of recommended dictionary. 
###### 00:07:41.332 Done 56 runs in 2 second(s) 00:07:41.591 05:07:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:41.591 05:07:12 -- ../common.sh@72 -- # (( i++ )) 00:07:41.591 05:07:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:41.591 05:07:12 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:41.591 05:07:12 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:41.591 05:07:12 -- nvmf/run.sh@24 -- # local timen=1 00:07:41.591 05:07:12 -- nvmf/run.sh@25 -- # local core=0x1 00:07:41.591 05:07:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:41.591 05:07:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:41.591 05:07:12 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:41.591 05:07:12 -- nvmf/run.sh@29 -- # port=4401 00:07:41.591 05:07:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:41.591 05:07:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:41.591 05:07:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:41.591 05:07:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:41.591 [2024-07-23 05:07:12.493391] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:41.591 [2024-07-23 05:07:12.493471] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3139195 ] 00:07:41.591 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.850 [2024-07-23 05:07:12.711367] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.850 [2024-07-23 05:07:12.787340] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:41.851 [2024-07-23 05:07:12.787519] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.851 [2024-07-23 05:07:12.848461] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:41.851 [2024-07-23 05:07:12.864805] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:41.851 INFO: Running with entropic power schedule (0xFF, 100). 00:07:41.851 INFO: Seed: 675826309 00:07:41.851 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:41.851 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:41.851 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:41.851 INFO: A corpus is not provided, starting from an empty corpus 00:07:41.851 #2 INITED exec/s: 0 rss: 60Mb 00:07:41.851 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:41.851 This may also happen if the target rejected all inputs we tried so far 00:07:41.851 [2024-07-23 05:07:12.913947] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.851 [2024-07-23 05:07:12.914079] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.851 [2024-07-23 05:07:12.914201] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.851 [2024-07-23 05:07:12.914321] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.851 [2024-07-23 05:07:12.914448] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.851 [2024-07-23 05:07:12.914695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.851 [2024-07-23 05:07:12.914732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.851 [2024-07-23 05:07:12.914796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.851 [2024-07-23 05:07:12.914814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.851 [2024-07-23 05:07:12.914874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.851 [2024-07-23 05:07:12.914892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.851 [2024-07-23 05:07:12.914952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.851 [2024-07-23 05:07:12.914971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.851 [2024-07-23 05:07:12.915037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.851 [2024-07-23 05:07:12.915055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.045 NEW_FUNC[1/671]: 0x481610 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:43.045 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:43.045 #7 NEW cov: 11553 ft: 11554 corp: 2/31b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 5 InsertRepeatedBytes-EraseBytes-EraseBytes-EraseBytes-InsertRepeatedBytes- 00:07:43.045 [2024-07-23 05:07:13.876312] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.045 [2024-07-23 05:07:13.876466] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff56 00:07:43.045 [2024-07-23 05:07:13.876705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.045 [2024-07-23 
05:07:13.876746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.045 [2024-07-23 05:07:13.876810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.045 [2024-07-23 05:07:13.876829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.045 #12 NEW cov: 11666 ft: 12659 corp: 3/44b lim: 30 exec/s: 12 rss: 67Mb L: 13/30 MS: 5 ChangeByte-ChangeBit-ChangeBit-CopyPart-InsertRepeatedBytes- 00:07:43.045 [2024-07-23 05:07:13.926438] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.045 [2024-07-23 05:07:13.926578] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.045 [2024-07-23 05:07:13.926704] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.046 [2024-07-23 05:07:13.926826] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.046 [2024-07-23 05:07:13.926953] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.046 [2024-07-23 05:07:13.927221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.046 [2024-07-23 05:07:13.927256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.046 [2024-07-23 05:07:13.927323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.046 [2024-07-23 05:07:13.927343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.046 [2024-07-23 05:07:13.927406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.046 [2024-07-23 05:07:13.927425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.046 [2024-07-23 05:07:13.927493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.046 [2024-07-23 05:07:13.927512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.046 [2024-07-23 05:07:13.927575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.046 [2024-07-23 05:07:13.927594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.046 #13 NEW cov: 11672 ft: 12989 corp: 4/74b lim: 30 exec/s: 13 rss: 67Mb L: 30/30 MS: 1 CopyPart- 00:07:43.046 [2024-07-23 05:07:13.986529] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.046 [2024-07-23 05:07:13.986660] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.046 [2024-07-23 05:07:13.986892] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.046 [2024-07-23 05:07:13.986924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.046 [2024-07-23 05:07:13.986987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.046 [2024-07-23 05:07:13.987006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.046 #14 NEW cov: 11757 ft: 13372 corp: 5/91b lim: 30 exec/s: 14 rss: 67Mb L: 17/30 MS: 1 EraseBytes- 00:07:43.046 [2024-07-23 05:07:14.046659] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.046 [2024-07-23 05:07:14.046893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.046 [2024-07-23 05:07:14.046925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.046 #17 NEW cov: 11757 ft: 13793 corp: 6/99b lim: 30 exec/s: 17 rss: 67Mb L: 8/30 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:43.046 [2024-07-23 05:07:14.096806] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1 00:07:43.046 [2024-07-23 05:07:14.097037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff0001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.046 [2024-07-23 05:07:14.097069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.046 #18 NEW cov: 11757 ft: 13889 corp: 7/107b lim: 30 exec/s: 18 rss: 67Mb L: 8/30 MS: 1 ChangeBinInt- 00:07:43.305 [2024-07-23 05:07:14.156934] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004d4d 00:07:43.305 [2024-07-23 05:07:14.157188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c80a814d cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.305 [2024-07-23 05:07:14.157220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.305 #20 NEW cov: 11757 ft: 13966 corp: 8/118b lim: 30 exec/s: 20 rss: 67Mb L: 11/30 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:43.305 [2024-07-23 05:07:14.197083] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x101 00:07:43.305 [2024-07-23 05:07:14.197323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.305 [2024-07-23 05:07:14.197355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.305 #21 NEW cov: 11757 ft: 14050 corp: 9/126b lim: 30 exec/s: 21 rss: 68Mb L: 8/30 MS: 1 CopyPart- 00:07:43.305 [2024-07-23 05:07:14.257308] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aff 00:07:43.305 [2024-07-23 05:07:14.257435] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.305 [2024-07-23 
05:07:14.257563] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.305 [2024-07-23 05:07:14.257683] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.305 [2024-07-23 05:07:14.257932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.305 [2024-07-23 05:07:14.257968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.305 [2024-07-23 05:07:14.258032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.305 [2024-07-23 05:07:14.258051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.306 [2024-07-23 05:07:14.258114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.306 [2024-07-23 05:07:14.258132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.306 [2024-07-23 05:07:14.258195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.306 [2024-07-23 05:07:14.258213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.306 #22 NEW cov: 11757 ft: 14138 corp: 10/150b lim: 30 exec/s: 22 rss: 68Mb L: 24/30 MS: 1 CrossOver- 00:07:43.306 [2024-07-23 05:07:14.307389] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff56 00:07:43.306 [2024-07-23 05:07:14.307630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.306 [2024-07-23 05:07:14.307662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.306 #23 NEW cov: 11757 ft: 14207 corp: 11/157b lim: 30 exec/s: 23 rss: 68Mb L: 7/30 MS: 1 EraseBytes- 00:07:43.306 [2024-07-23 05:07:14.367528] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x101 00:07:43.306 [2024-07-23 05:07:14.367763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c7010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.306 [2024-07-23 05:07:14.367795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.565 #24 NEW cov: 11757 ft: 14216 corp: 12/165b lim: 30 exec/s: 24 rss: 68Mb L: 8/30 MS: 1 ChangeByte- 00:07:43.565 [2024-07-23 05:07:14.427791] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:43.565 [2024-07-23 05:07:14.428025] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (787456) > buf size (4096) 00:07:43.565 [2024-07-23 05:07:14.428260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.565 [2024-07-23 05:07:14.428292] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.565 [2024-07-23 05:07:14.428358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.565 [2024-07-23 05:07:14.428377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.565 [2024-07-23 05:07:14.428446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.565 [2024-07-23 05:07:14.428465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.565 #25 NEW cov: 11797 ft: 14539 corp: 13/183b lim: 30 exec/s: 25 rss: 68Mb L: 18/30 MS: 1 InsertRepeatedBytes- 00:07:43.565 [2024-07-23 05:07:14.477901] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.565 [2024-07-23 05:07:14.478026] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.565 [2024-07-23 05:07:14.478142] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.566 [2024-07-23 05:07:14.478406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.566 [2024-07-23 05:07:14.478438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.566 [2024-07-23 05:07:14.478512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.566 [2024-07-23 05:07:14.478531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.566 [2024-07-23 05:07:14.478594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.566 [2024-07-23 05:07:14.478612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.566 #31 NEW cov: 11797 ft: 14593 corp: 14/204b lim: 30 exec/s: 31 rss: 68Mb L: 21/30 MS: 1 CrossOver- 00:07:43.566 [2024-07-23 05:07:14.537983] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1 00:07:43.566 [2024-07-23 05:07:14.538219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff0001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.566 [2024-07-23 05:07:14.538250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.566 #32 NEW cov: 11797 ft: 14601 corp: 15/212b lim: 30 exec/s: 32 rss: 68Mb L: 8/30 MS: 1 CopyPart- 00:07:43.566 [2024-07-23 05:07:14.578101] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:43.566 [2024-07-23 05:07:14.578226] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.566 [2024-07-23 05:07:14.578461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE 
(02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.566 [2024-07-23 05:07:14.578493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.566 [2024-07-23 05:07:14.578558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.566 [2024-07-23 05:07:14.578577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.566 #33 NEW cov: 11797 ft: 14649 corp: 16/226b lim: 30 exec/s: 33 rss: 68Mb L: 14/30 MS: 1 InsertByte- 00:07:43.566 [2024-07-23 05:07:14.628260] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b2b2 00:07:43.566 [2024-07-23 05:07:14.628395] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004d4d 00:07:43.566 [2024-07-23 05:07:14.628639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c80a024d cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.566 [2024-07-23 05:07:14.628671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.566 [2024-07-23 05:07:14.628739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b2b281b2 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.566 [2024-07-23 05:07:14.628759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.825 #34 NEW cov: 11797 ft: 14728 corp: 17/243b lim: 30 exec/s: 34 rss: 68Mb L: 17/30 MS: 1 InsertRepeatedBytes- 00:07:43.825 [2024-07-23 05:07:14.688458] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.825 [2024-07-23 05:07:14.688592] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.825 [2024-07-23 05:07:14.688715] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.825 [2024-07-23 05:07:14.688949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.825 [2024-07-23 05:07:14.688980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.825 [2024-07-23 05:07:14.689043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.825 [2024-07-23 05:07:14.689062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.826 [2024-07-23 05:07:14.689127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-07-23 05:07:14.689146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.826 #35 NEW cov: 11797 ft: 14768 corp: 18/262b lim: 30 exec/s: 35 rss: 68Mb L: 19/30 MS: 1 CrossOver- 00:07:43.826 [2024-07-23 05:07:14.738605] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log 
page offset 0x30000ffff 00:07:43.826 [2024-07-23 05:07:14.738736] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.826 [2024-07-23 05:07:14.738856] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.826 [2024-07-23 05:07:14.739086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-07-23 05:07:14.739118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.826 [2024-07-23 05:07:14.739185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-07-23 05:07:14.739204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.826 [2024-07-23 05:07:14.739266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-07-23 05:07:14.739284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.826 #36 NEW cov: 11797 ft: 14792 corp: 19/284b lim: 30 exec/s: 36 rss: 68Mb L: 22/30 MS: 1 EraseBytes- 00:07:43.826 [2024-07-23 05:07:14.798751] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.826 [2024-07-23 05:07:14.798875] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.826 [2024-07-23 05:07:14.798990] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.826 [2024-07-23 05:07:14.799220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff830e cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-07-23 05:07:14.799251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.826 [2024-07-23 05:07:14.799317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-07-23 05:07:14.799336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.826 [2024-07-23 05:07:14.799399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-07-23 05:07:14.799417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.826 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.826 #37 NEW cov: 11820 ft: 14829 corp: 20/306b lim: 30 exec/s: 37 rss: 69Mb L: 22/30 MS: 1 ChangeBinInt- 00:07:43.826 [2024-07-23 05:07:14.858933] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.826 [2024-07-23 05:07:14.859067] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.826 [2024-07-23 05:07:14.859302] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-07-23 05:07:14.859334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.826 [2024-07-23 05:07:14.859398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-07-23 05:07:14.859417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.826 #38 NEW cov: 11820 ft: 14839 corp: 21/323b lim: 30 exec/s: 19 rss: 69Mb L: 17/30 MS: 1 EraseBytes- 00:07:43.826 #38 DONE cov: 11820 ft: 14839 corp: 21/323b lim: 30 exec/s: 19 rss: 69Mb 00:07:43.826 Done 38 runs in 2 second(s) 00:07:44.085 05:07:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:44.085 05:07:15 -- ../common.sh@72 -- # (( i++ )) 00:07:44.085 05:07:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:44.085 05:07:15 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:44.085 05:07:15 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:44.085 05:07:15 -- nvmf/run.sh@24 -- # local timen=1 00:07:44.085 05:07:15 -- nvmf/run.sh@25 -- # local core=0x1 00:07:44.085 05:07:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:44.085 05:07:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:44.085 05:07:15 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:44.085 05:07:15 -- nvmf/run.sh@29 -- # port=4402 00:07:44.085 05:07:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:44.085 05:07:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:44.085 05:07:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:44.085 05:07:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:44.085 [2024-07-23 05:07:15.077540] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:44.085 [2024-07-23 05:07:15.077619] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3139614 ] 00:07:44.085 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.344 [2024-07-23 05:07:15.299184] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.345 [2024-07-23 05:07:15.375332] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:44.345 [2024-07-23 05:07:15.375520] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.345 [2024-07-23 05:07:15.436667] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:44.604 [2024-07-23 05:07:15.453014] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:44.604 INFO: Running with entropic power schedule (0xFF, 100). 00:07:44.604 INFO: Seed: 3265832347 00:07:44.604 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:44.604 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:44.604 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:44.604 INFO: A corpus is not provided, starting from an empty corpus 00:07:44.604 #2 INITED exec/s: 0 rss: 60Mb 00:07:44.604 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:44.604 This may also happen if the target rejected all inputs we tried so far 00:07:44.604 [2024-07-23 05:07:15.529602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-07-23 05:07:15.529648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.604 [2024-07-23 05:07:15.529779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-07-23 05:07:15.529803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.863 NEW_FUNC[1/670]: 0x484030 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:44.863 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:44.863 #6 NEW cov: 11511 ft: 11512 corp: 2/19b lim: 35 exec/s: 0 rss: 67Mb L: 18/18 MS: 4 CrossOver-ChangeBit-EraseBytes-InsertRepeatedBytes- 00:07:45.122 [2024-07-23 05:07:15.980766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.122 [2024-07-23 05:07:15.980815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.122 [2024-07-23 05:07:15.980948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.122 [2024-07-23 05:07:15.980970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.122 #12 NEW cov: 11624 ft: 12207 corp: 3/37b lim: 35 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ChangeBinInt- 00:07:45.122 [2024-07-23 05:07:16.050848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28280028 cdw11:2800283f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.122 [2024-07-23 05:07:16.050887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.122 [2024-07-23 05:07:16.051013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.122 [2024-07-23 05:07:16.051038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.122 #13 NEW cov: 11630 ft: 12386 corp: 4/55b lim: 35 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ChangeByte- 00:07:45.122 [2024-07-23 05:07:16.110990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28280028 cdw11:2800283f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.122 [2024-07-23 05:07:16.111026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.122 [2024-07-23 05:07:16.111158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.122 [2024-07-23 05:07:16.111180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.122 #14 NEW cov: 11715 ft: 12614 corp: 5/73b lim: 35 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ShuffleBytes- 00:07:45.122 [2024-07-23 05:07:16.181155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28750028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.122 [2024-07-23 05:07:16.181190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.122 [2024-07-23 05:07:16.181315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.122 [2024-07-23 05:07:16.181342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.381 #15 NEW cov: 11715 ft: 12670 corp: 6/92b lim: 35 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertByte- 00:07:45.381 [2024-07-23 05:07:16.251377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:283f0028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.381 [2024-07-23 05:07:16.251412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.381 [2024-07-23 05:07:16.251541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.251565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.382 #16 NEW cov: 11715 ft: 12730 corp: 7/110b lim: 35 exec/s: 0 rss: 67Mb L: 18/19 MS: 1 ChangeByte- 00:07:45.382 [2024-07-23 
05:07:16.301627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28750028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.301662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.382 [2024-07-23 05:07:16.301794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.301816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.382 #17 NEW cov: 11715 ft: 12832 corp: 8/124b lim: 35 exec/s: 0 rss: 68Mb L: 14/19 MS: 1 EraseBytes- 00:07:45.382 [2024-07-23 05:07:16.372062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28750028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.372100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.382 [2024-07-23 05:07:16.372235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.372258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.382 [2024-07-23 05:07:16.372402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.372425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.382 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:45.382 #18 NEW cov: 11738 ft: 13109 corp: 9/147b lim: 35 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 CopyPart- 00:07:45.382 [2024-07-23 05:07:16.431653] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.382 [2024-07-23 05:07:16.431846] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.382 [2024-07-23 05:07:16.432021] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.382 [2024-07-23 05:07:16.432192] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.382 [2024-07-23 05:07:16.432561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.432602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.382 [2024-07-23 05:07:16.432745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.432780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.382 [2024-07-23 05:07:16.432915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY 
(06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.432941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.382 [2024-07-23 05:07:16.433078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.382 [2024-07-23 05:07:16.433101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.382 #20 NEW cov: 11747 ft: 13739 corp: 10/175b lim: 35 exec/s: 0 rss: 68Mb L: 28/28 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:45.641 [2024-07-23 05:07:16.491947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28750028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.491982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.641 #21 NEW cov: 11747 ft: 14186 corp: 11/183b lim: 35 exec/s: 21 rss: 68Mb L: 8/28 MS: 1 EraseBytes- 00:07:45.641 [2024-07-23 05:07:16.562067] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.641 [2024-07-23 05:07:16.562248] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.641 [2024-07-23 05:07:16.562419] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.641 [2024-07-23 05:07:16.562602] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.641 [2024-07-23 05:07:16.562981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.563026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.641 [2024-07-23 05:07:16.563166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.563197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.641 [2024-07-23 05:07:16.563331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.563361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.641 [2024-07-23 05:07:16.563501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.563529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.641 #22 NEW cov: 11747 ft: 14195 corp: 12/211b lim: 35 exec/s: 22 rss: 68Mb L: 28/28 MS: 1 ShuffleBytes- 00:07:45.641 [2024-07-23 05:07:16.632390] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 
00:07:45.641 [2024-07-23 05:07:16.632566] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.641 [2024-07-23 05:07:16.632742] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.641 [2024-07-23 05:07:16.633298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.633341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.641 [2024-07-23 05:07:16.633478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.633507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.641 [2024-07-23 05:07:16.633640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff0000fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.633670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.641 [2024-07-23 05:07:16.633799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.633822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.641 #23 NEW cov: 11747 ft: 14231 corp: 13/239b lim: 35 exec/s: 23 rss: 68Mb L: 28/28 MS: 1 ChangeBinInt- 00:07:45.641 [2024-07-23 05:07:16.702601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28750028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.641 [2024-07-23 05:07:16.702636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.641 #24 NEW cov: 11747 ft: 14256 corp: 14/249b lim: 35 exec/s: 24 rss: 68Mb L: 10/28 MS: 1 EraseBytes- 00:07:45.900 [2024-07-23 05:07:16.762685] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.900 [2024-07-23 05:07:16.762863] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.900 [2024-07-23 05:07:16.763038] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.900 [2024-07-23 05:07:16.763206] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.900 [2024-07-23 05:07:16.763595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.900 [2024-07-23 05:07:16.763639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.900 [2024-07-23 05:07:16.763788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.900 [2024-07-23 05:07:16.763818] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.900 [2024-07-23 05:07:16.763936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.900 [2024-07-23 05:07:16.763968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.900 [2024-07-23 05:07:16.764112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.900 [2024-07-23 05:07:16.764134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.900 #25 NEW cov: 11747 ft: 14284 corp: 15/277b lim: 35 exec/s: 25 rss: 68Mb L: 28/28 MS: 1 ShuffleBytes- 00:07:45.900 [2024-07-23 05:07:16.823518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28750028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.900 [2024-07-23 05:07:16.823553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.900 [2024-07-23 05:07:16.823689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.900 [2024-07-23 05:07:16.823710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.900 [2024-07-23 05:07:16.823841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.900 [2024-07-23 05:07:16.823866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.900 #26 NEW cov: 11747 ft: 14304 corp: 16/300b lim: 35 exec/s: 26 rss: 68Mb L: 23/28 MS: 1 ChangeByte- 00:07:45.900 [2024-07-23 05:07:16.893121] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.900 [2024-07-23 05:07:16.893308] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.900 [2024-07-23 05:07:16.893490] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.900 [2024-07-23 05:07:16.893655] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.901 [2024-07-23 05:07:16.894048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.901 [2024-07-23 05:07:16.894089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.901 [2024-07-23 05:07:16.894223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:10000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.901 [2024-07-23 05:07:16.894256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.901 [2024-07-23 05:07:16.894393] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.901 [2024-07-23 05:07:16.894425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.901 [2024-07-23 05:07:16.894556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.901 [2024-07-23 05:07:16.894586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.901 #27 NEW cov: 11747 ft: 14323 corp: 17/328b lim: 35 exec/s: 27 rss: 68Mb L: 28/28 MS: 1 ChangeBit- 00:07:45.901 [2024-07-23 05:07:16.953551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.901 [2024-07-23 05:07:16.953587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.901 [2024-07-23 05:07:16.953721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.901 [2024-07-23 05:07:16.953744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.901 #28 NEW cov: 11747 ft: 14363 corp: 18/346b lim: 35 exec/s: 28 rss: 68Mb L: 18/28 MS: 1 ChangeBit- 00:07:46.160 [2024-07-23 05:07:17.003765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28750028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.160 [2024-07-23 05:07:17.003801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.160 [2024-07-23 05:07:17.003943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002f28 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.160 [2024-07-23 05:07:17.003965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.160 #29 NEW cov: 11747 ft: 14378 corp: 19/366b lim: 35 exec/s: 29 rss: 68Mb L: 20/28 MS: 1 InsertByte- 00:07:46.160 [2024-07-23 05:07:17.053592] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.160 [2024-07-23 05:07:17.053785] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.160 [2024-07-23 05:07:17.053963] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.160 [2024-07-23 05:07:17.054128] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.160 [2024-07-23 05:07:17.054521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:1c000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.160 [2024-07-23 05:07:17.054561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.160 [2024-07-23 05:07:17.054691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.160 [2024-07-23 05:07:17.054721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.160 [2024-07-23 05:07:17.054847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.160 [2024-07-23 05:07:17.054875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.160 [2024-07-23 05:07:17.055011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.160 [2024-07-23 05:07:17.055039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.160 #30 NEW cov: 11747 ft: 14453 corp: 20/394b lim: 35 exec/s: 30 rss: 68Mb L: 28/28 MS: 1 ChangeBinInt- 00:07:46.160 [2024-07-23 05:07:17.124385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28750028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.160 [2024-07-23 05:07:17.124420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.160 [2024-07-23 05:07:17.124556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.160 [2024-07-23 05:07:17.124580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.160 [2024-07-23 05:07:17.124703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:28280028 cdw11:280028b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.160 [2024-07-23 05:07:17.124725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.160 #31 NEW cov: 11747 ft: 14490 corp: 21/418b lim: 35 exec/s: 31 rss: 69Mb L: 24/28 MS: 1 InsertByte- 00:07:46.160 [2024-07-23 05:07:17.193977] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.160 [2024-07-23 05:07:17.194153] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.161 [2024-07-23 05:07:17.194326] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.161 [2024-07-23 05:07:17.194509] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.161 [2024-07-23 05:07:17.194909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.161 [2024-07-23 05:07:17.194955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.161 [2024-07-23 05:07:17.195099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.161 [2024-07-23 05:07:17.195129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.161 [2024-07-23 05:07:17.195259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.161 [2024-07-23 05:07:17.195290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.161 [2024-07-23 05:07:17.195424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.161 [2024-07-23 05:07:17.195449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.161 #32 NEW cov: 11747 ft: 14497 corp: 22/446b lim: 35 exec/s: 32 rss: 69Mb L: 28/28 MS: 1 ShuffleBytes- 00:07:46.419 [2024-07-23 05:07:17.254982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:283f0028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.419 [2024-07-23 05:07:17.255017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.419 [2024-07-23 05:07:17.255124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a0a00028 cdw11:a000a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.419 [2024-07-23 05:07:17.255149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.419 [2024-07-23 05:07:17.255285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:a0a000a0 cdw11:2800a028 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.419 [2024-07-23 05:07:17.255308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.419 [2024-07-23 05:07:17.255440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.419 [2024-07-23 05:07:17.255469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.419 #33 NEW cov: 11747 ft: 14526 corp: 23/474b lim: 35 exec/s: 33 rss: 69Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:07:46.419 [2024-07-23 05:07:17.324792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:28750028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.419 [2024-07-23 05:07:17.324827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.420 [2024-07-23 05:07:17.324956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002028 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.420 [2024-07-23 05:07:17.324981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.420 #34 NEW cov: 11747 ft: 14537 corp: 24/488b lim: 35 exec/s: 34 rss: 69Mb L: 14/28 MS: 1 ChangeBit- 00:07:46.420 [2024-07-23 05:07:17.384577] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.420 [2024-07-23 05:07:17.384761] 
ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.420 [2024-07-23 05:07:17.384928] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.420 [2024-07-23 05:07:17.385100] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.420 [2024-07-23 05:07:17.385493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.420 [2024-07-23 05:07:17.385540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.420 [2024-07-23 05:07:17.385671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.420 [2024-07-23 05:07:17.385700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.420 [2024-07-23 05:07:17.385829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.420 [2024-07-23 05:07:17.385858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.420 [2024-07-23 05:07:17.385979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:001c0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.420 [2024-07-23 05:07:17.386007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.420 #35 NEW cov: 11747 ft: 14582 corp: 25/516b lim: 35 exec/s: 35 rss: 69Mb L: 28/28 MS: 1 ChangeBinInt- 00:07:46.420 [2024-07-23 05:07:17.454859] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.420 [2024-07-23 05:07:17.455044] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.420 [2024-07-23 05:07:17.455218] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.420 [2024-07-23 05:07:17.455389] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.420 [2024-07-23 05:07:17.455786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.420 [2024-07-23 05:07:17.455834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.420 [2024-07-23 05:07:17.455978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.420 [2024-07-23 05:07:17.456009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.420 [2024-07-23 05:07:17.456159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.420 [2024-07-23 05:07:17.456187] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.420 [2024-07-23 05:07:17.456286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.420 [2024-07-23 05:07:17.456311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.420 #36 NEW cov: 11747 ft: 14593 corp: 26/549b lim: 35 exec/s: 36 rss: 69Mb L: 33/33 MS: 1 CopyPart- 00:07:46.679 [2024-07-23 05:07:17.525477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2828005f cdw11:2800283f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.679 [2024-07-23 05:07:17.525512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.679 [2024-07-23 05:07:17.525665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28002828 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.679 [2024-07-23 05:07:17.525688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.679 #41 NEW cov: 11747 ft: 14605 corp: 27/565b lim: 35 exec/s: 20 rss: 69Mb L: 16/33 MS: 5 ChangeByte-ShuffleBytes-ChangeByte-ChangeBit-CrossOver- 00:07:46.679 #41 DONE cov: 11747 ft: 14605 corp: 27/565b lim: 35 exec/s: 20 rss: 69Mb 00:07:46.679 Done 41 runs in 2 second(s) 00:07:46.679 05:07:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:46.679 05:07:17 -- ../common.sh@72 -- # (( i++ )) 00:07:46.679 05:07:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:46.679 05:07:17 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:46.679 05:07:17 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:46.679 05:07:17 -- nvmf/run.sh@24 -- # local timen=1 00:07:46.679 05:07:17 -- nvmf/run.sh@25 -- # local core=0x1 00:07:46.679 05:07:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:46.679 05:07:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:46.679 05:07:17 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:46.679 05:07:17 -- nvmf/run.sh@29 -- # port=4403 00:07:46.679 05:07:17 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:46.679 05:07:17 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:46.679 05:07:17 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:46.679 05:07:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:46.679 [2024-07-23 05:07:17.735713] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:46.679 [2024-07-23 05:07:17.735782] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3140151 ] 00:07:46.938 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.938 [2024-07-23 05:07:17.951569] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.938 [2024-07-23 05:07:18.026886] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:46.938 [2024-07-23 05:07:18.027061] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.196 [2024-07-23 05:07:18.088042] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.196 [2024-07-23 05:07:18.104364] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:47.196 INFO: Running with entropic power schedule (0xFF, 100). 00:07:47.196 INFO: Seed: 1619862110 00:07:47.196 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:47.196 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:47.196 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:47.196 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.196 #2 INITED exec/s: 0 rss: 60Mb 00:07:47.196 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:47.196 This may also happen if the target rejected all inputs we tried so far 00:07:47.763 NEW_FUNC[1/659]: 0x485d00 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:47.763 NEW_FUNC[2/659]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:47.763 #7 NEW cov: 11423 ft: 11424 corp: 2/12b lim: 20 exec/s: 0 rss: 67Mb L: 11/11 MS: 5 ShuffleBytes-ChangeByte-InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:47.763 #13 NEW cov: 11536 ft: 11953 corp: 3/23b lim: 20 exec/s: 0 rss: 67Mb L: 11/11 MS: 1 CopyPart- 00:07:47.763 #14 NEW cov: 11542 ft: 12281 corp: 4/34b lim: 20 exec/s: 0 rss: 67Mb L: 11/11 MS: 1 ChangeBinInt- 00:07:47.763 #15 NEW cov: 11644 ft: 12899 corp: 5/50b lim: 20 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:07:47.763 #16 NEW cov: 11644 ft: 13042 corp: 6/59b lim: 20 exec/s: 0 rss: 67Mb L: 9/16 MS: 1 CMP- DE: "\377/\343F\254\315\314\032"- 00:07:48.021 #17 NEW cov: 11644 ft: 13108 corp: 7/69b lim: 20 exec/s: 0 rss: 67Mb L: 10/16 MS: 1 EraseBytes- 00:07:48.021 NEW_FUNC[1/4]: 0x113e4b0 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:48.021 NEW_FUNC[2/4]: 0x113f030 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:48.021 #18 NEW cov: 11727 ft: 13278 corp: 8/80b lim: 20 exec/s: 0 rss: 67Mb L: 11/16 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:48.021 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:48.021 #19 NEW cov: 11750 ft: 13324 corp: 9/91b lim: 20 exec/s: 0 rss: 68Mb L: 11/16 MS: 1 ChangeByte- 00:07:48.021 #20 NEW cov: 11750 ft: 13336 corp: 10/101b lim: 20 exec/s: 0 rss: 68Mb L: 10/16 MS: 1 CrossOver- 00:07:48.279 #21 NEW cov: 11750 ft: 13485 corp: 11/112b lim: 20 exec/s: 21 rss: 68Mb L: 11/16 
MS: 1 ShuffleBytes- 00:07:48.279 #22 NEW cov: 11754 ft: 13601 corp: 12/126b lim: 20 exec/s: 22 rss: 68Mb L: 14/16 MS: 1 InsertRepeatedBytes- 00:07:48.279 #23 NEW cov: 11754 ft: 13678 corp: 13/138b lim: 20 exec/s: 23 rss: 68Mb L: 12/16 MS: 1 CrossOver- 00:07:48.279 #24 NEW cov: 11754 ft: 13703 corp: 14/148b lim: 20 exec/s: 24 rss: 68Mb L: 10/16 MS: 1 ChangeBinInt- 00:07:48.536 #25 NEW cov: 11754 ft: 13714 corp: 15/160b lim: 20 exec/s: 25 rss: 68Mb L: 12/16 MS: 1 ChangeByte- 00:07:48.536 #26 NEW cov: 11754 ft: 13732 corp: 16/171b lim: 20 exec/s: 26 rss: 68Mb L: 11/16 MS: 1 ChangeBinInt- 00:07:48.536 [2024-07-23 05:07:19.538264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.536 [2024-07-23 05:07:19.538316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.536 NEW_FUNC[1/15]: 0x1550190 in nvme_ctrlr_process_async_event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3091 00:07:48.536 NEW_FUNC[2/15]: 0x15e5b50 in nvme_ctrlr_queue_async_event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3131 00:07:48.536 #30 NEW cov: 11969 ft: 14005 corp: 17/181b lim: 20 exec/s: 30 rss: 68Mb L: 10/16 MS: 4 CopyPart-ChangeByte-ChangeASCIIInt-CMP- DE: "\002\000\000\000\000\000\000\000"- 00:07:48.794 #31 NEW cov: 11969 ft: 14073 corp: 18/198b lim: 20 exec/s: 31 rss: 68Mb L: 17/17 MS: 1 InsertByte- 00:07:48.794 #32 NEW cov: 11969 ft: 14088 corp: 19/216b lim: 20 exec/s: 32 rss: 69Mb L: 18/18 MS: 1 CrossOver- 00:07:48.794 #33 NEW cov: 11969 ft: 14098 corp: 20/233b lim: 20 exec/s: 33 rss: 69Mb L: 17/18 MS: 1 ChangeBit- 00:07:48.794 #34 NEW cov: 11969 ft: 14108 corp: 21/243b lim: 20 exec/s: 34 rss: 69Mb L: 10/18 MS: 1 CMP- DE: "\0000\343F\254\315\314\032"- 00:07:49.052 #35 NEW cov: 11969 ft: 14138 corp: 22/254b lim: 20 exec/s: 35 rss: 69Mb L: 11/18 MS: 1 ShuffleBytes- 00:07:49.052 #36 NEW cov: 11969 ft: 14158 corp: 23/265b lim: 20 exec/s: 36 rss: 69Mb L: 11/18 MS: 1 CMP- DE: "\377/\343A\334s\326\260"- 00:07:49.052 [2024-07-23 05:07:19.980449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.052 [2024-07-23 05:07:19.980494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.052 #37 NEW cov: 11969 ft: 14320 corp: 24/285b lim: 20 exec/s: 37 rss: 69Mb L: 20/20 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:07:49.052 #38 NEW cov: 11969 ft: 14403 corp: 25/303b lim: 20 exec/s: 38 rss: 69Mb L: 18/20 MS: 1 ChangeByte- 00:07:49.310 #39 NEW cov: 11969 ft: 14637 corp: 26/308b lim: 20 exec/s: 19 rss: 69Mb L: 5/20 MS: 1 CrossOver- 00:07:49.311 #39 DONE cov: 11969 ft: 14637 corp: 26/308b lim: 20 exec/s: 19 rss: 69Mb 00:07:49.311 ###### Recommended dictionary. ###### 00:07:49.311 "\377/\343F\254\315\314\032" # Uses: 0 00:07:49.311 "\000\000\000\000" # Uses: 0 00:07:49.311 "\002\000\000\000\000\000\000\000" # Uses: 1 00:07:49.311 "\0000\343F\254\315\314\032" # Uses: 0 00:07:49.311 "\377/\343A\334s\326\260" # Uses: 0 00:07:49.311 ###### End of recommended dictionary. 
###### 00:07:49.311 Done 39 runs in 2 second(s) 00:07:49.311 05:07:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:49.311 05:07:20 -- ../common.sh@72 -- # (( i++ )) 00:07:49.311 05:07:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.311 05:07:20 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:49.311 05:07:20 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:49.311 05:07:20 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.311 05:07:20 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.311 05:07:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:49.311 05:07:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:49.311 05:07:20 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:49.311 05:07:20 -- nvmf/run.sh@29 -- # port=4404 00:07:49.311 05:07:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:49.311 05:07:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:49.311 05:07:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.311 05:07:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:49.311 [2024-07-23 05:07:20.338055] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:49.311 [2024-07-23 05:07:20.338128] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3140628 ] 00:07:49.311 EAL: No free 2048 kB hugepages reported on node 1 00:07:49.570 [2024-07-23 05:07:20.633457] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.829 [2024-07-23 05:07:20.736048] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:49.829 [2024-07-23 05:07:20.736223] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.829 [2024-07-23 05:07:20.797166] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:49.829 [2024-07-23 05:07:20.813533] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:49.829 INFO: Running with entropic power schedule (0xFF, 100). 00:07:49.829 INFO: Seed: 35888663 00:07:49.829 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:49.829 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:49.829 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:49.829 INFO: A corpus is not provided, starting from an empty corpus 00:07:49.829 #2 INITED exec/s: 0 rss: 60Mb 00:07:49.829 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:49.829 This may also happen if the target rejected all inputs we tried so far 00:07:49.829 [2024-07-23 05:07:20.868941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.829 [2024-07-23 05:07:20.868977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.407 NEW_FUNC[1/671]: 0x486df0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:50.407 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.407 #3 NEW cov: 11532 ft: 11527 corp: 2/13b lim: 35 exec/s: 0 rss: 66Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:07:50.407 [2024-07-23 05:07:21.310422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00007c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.310467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.407 [2024-07-23 05:07:21.310533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.310559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.407 [2024-07-23 05:07:21.310620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.310637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.407 #5 NEW cov: 11645 ft: 12912 corp: 3/34b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:50.407 [2024-07-23 05:07:21.360455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00007c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.360488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.407 [2024-07-23 05:07:21.360553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.360571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.407 [2024-07-23 05:07:21.360634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:5d5d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.360651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.407 #6 NEW cov: 11651 ft: 13058 corp: 4/61b lim: 35 exec/s: 0 rss: 67Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:50.407 [2024-07-23 05:07:21.420816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.420848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.407 [2024-07-23 05:07:21.420912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.420931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.407 [2024-07-23 05:07:21.420993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.421010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.407 [2024-07-23 05:07:21.421073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.421091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.407 #7 NEW cov: 11736 ft: 13581 corp: 5/93b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:50.407 [2024-07-23 05:07:21.480439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.407 [2024-07-23 05:07:21.480476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.667 #8 NEW cov: 11736 ft: 13671 corp: 6/105b lim: 35 exec/s: 0 rss: 67Mb L: 12/32 MS: 1 CrossOver- 00:07:50.667 [2024-07-23 05:07:21.530640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.530673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.667 #11 NEW cov: 11736 ft: 13794 corp: 7/116b lim: 35 exec/s: 0 rss: 67Mb L: 11/32 MS: 3 CopyPart-CrossOver-CrossOver- 00:07:50.667 [2024-07-23 05:07:21.571262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c780000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.571294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.667 [2024-07-23 05:07:21.571360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.571379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.667 [2024-07-23 05:07:21.571448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.571467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.667 [2024-07-23 05:07:21.571532] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:007c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.571549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.667 #12 NEW cov: 11736 ft: 13854 corp: 8/149b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertByte- 00:07:50.667 [2024-07-23 05:07:21.630915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.630948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.667 #13 NEW cov: 11736 ft: 13888 corp: 9/161b lim: 35 exec/s: 0 rss: 68Mb L: 12/33 MS: 1 ChangeBit- 00:07:50.667 [2024-07-23 05:07:21.681375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.681407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.667 [2024-07-23 05:07:21.681475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:8f8f8f8f cdw11:8f8f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.681494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.667 [2024-07-23 05:07:21.681557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:8f8f8f8f cdw11:8f8f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.681575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.667 #14 NEW cov: 11736 ft: 13956 corp: 10/187b lim: 35 exec/s: 0 rss: 68Mb L: 26/33 MS: 1 InsertRepeatedBytes- 00:07:50.667 [2024-07-23 05:07:21.741330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.741362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.667 [2024-07-23 05:07:21.741427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.667 [2024-07-23 05:07:21.741451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.927 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:50.927 #15 NEW cov: 11759 ft: 14210 corp: 11/207b lim: 35 exec/s: 0 rss: 68Mb L: 20/33 MS: 1 CMP- DE: "?\000\000\000\000\000\000\000"- 00:07:50.927 [2024-07-23 05:07:21.791895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c780000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.791928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 
05:07:21.791994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.792013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 05:07:21.792075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.792093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 05:07:21.792156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:007c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.792174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.927 #21 NEW cov: 11759 ft: 14251 corp: 12/240b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 CopyPart- 00:07:50.927 [2024-07-23 05:07:21.851937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.851969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 05:07:21.852035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:8f8f8f8f cdw11:8f8b0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.852052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 05:07:21.852117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:8f8f8f8f cdw11:8f8f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.852134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.927 #22 NEW cov: 11759 ft: 14278 corp: 13/266b lim: 35 exec/s: 22 rss: 69Mb L: 26/33 MS: 1 ChangeBit- 00:07:50.927 [2024-07-23 05:07:21.912267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.912298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 05:07:21.912362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:8f8f8f8f cdw11:8f8f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.912381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 05:07:21.912447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:8f8f8f8f cdw11:8f7c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.912465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 
05:07:21.912531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:008f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.912549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.927 #23 NEW cov: 11759 ft: 14327 corp: 14/295b lim: 35 exec/s: 23 rss: 69Mb L: 29/33 MS: 1 CrossOver- 00:07:50.927 [2024-07-23 05:07:21.962355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7e0a7c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.962386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 05:07:21.962451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:8f8f8f8f cdw11:8f8f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.962469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 05:07:21.962533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:8f8f8f8f cdw11:8f8f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.962550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.927 [2024-07-23 05:07:21.962614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.927 [2024-07-23 05:07:21.962633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.927 #24 NEW cov: 11759 ft: 14384 corp: 15/325b lim: 35 exec/s: 24 rss: 69Mb L: 30/33 MS: 1 InsertByte- 00:07:51.187 [2024-07-23 05:07:22.022570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c780001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.022603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.022654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.022672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.022738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.022755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.022820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.022837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.187 #25 NEW cov: 
11759 ft: 14404 corp: 16/359b lim: 35 exec/s: 25 rss: 69Mb L: 34/34 MS: 1 InsertByte- 00:07:51.187 [2024-07-23 05:07:22.072481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.072513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.072578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00005c7c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.072596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.072659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.072677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.187 #26 NEW cov: 11759 ft: 14429 corp: 17/382b lim: 35 exec/s: 26 rss: 69Mb L: 23/34 MS: 1 CrossOver- 00:07:51.187 [2024-07-23 05:07:22.132933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3f007c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.132964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.133028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.133046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.133122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.133139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.133201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.133219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.187 #27 NEW cov: 11759 ft: 14469 corp: 18/411b lim: 35 exec/s: 27 rss: 69Mb L: 29/34 MS: 1 PersAutoDict- DE: "?\000\000\000\000\000\000\000"- 00:07:51.187 [2024-07-23 05:07:22.182872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.182904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.182969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:8f8f8f8f cdw11:8f8b0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 
05:07:22.182987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.183053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:8f8f8f8f cdw11:8f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.183071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.187 #28 NEW cov: 11759 ft: 14491 corp: 19/437b lim: 35 exec/s: 28 rss: 69Mb L: 26/34 MS: 1 CrossOver- 00:07:51.187 [2024-07-23 05:07:22.243167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c780000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.243198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.187 [2024-07-23 05:07:22.243262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.187 [2024-07-23 05:07:22.243280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.188 [2024-07-23 05:07:22.243344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.188 [2024-07-23 05:07:22.243360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.188 [2024-07-23 05:07:22.243424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:007c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.188 [2024-07-23 05:07:22.243446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.447 #29 NEW cov: 11759 ft: 14510 corp: 20/470b lim: 35 exec/s: 29 rss: 69Mb L: 33/34 MS: 1 ChangeBinInt- 00:07:51.447 [2024-07-23 05:07:22.302968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.447 [2024-07-23 05:07:22.303000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.447 [2024-07-23 05:07:22.303065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.447 [2024-07-23 05:07:22.303082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.448 #30 NEW cov: 11759 ft: 14529 corp: 21/490b lim: 35 exec/s: 30 rss: 70Mb L: 20/34 MS: 1 CrossOver- 00:07:51.448 [2024-07-23 05:07:22.363361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.363392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.448 [2024-07-23 05:07:22.363469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:5 nsid:0 cdw10:00010000 cdw11:8f8b0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.363493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.448 [2024-07-23 05:07:22.363559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:8f8f8f8f cdw11:8f8f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.363576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.448 #31 NEW cov: 11759 ft: 14546 corp: 22/516b lim: 35 exec/s: 31 rss: 70Mb L: 26/34 MS: 1 CMP- DE: "\000\000\000\001"- 00:07:51.448 [2024-07-23 05:07:22.403160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a7c0a3f cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.403192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.448 #34 NEW cov: 11759 ft: 14625 corp: 23/525b lim: 35 exec/s: 34 rss: 70Mb L: 9/34 MS: 3 CrossOver-InsertByte-CrossOver- 00:07:51.448 [2024-07-23 05:07:22.453992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.454023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.448 [2024-07-23 05:07:22.454089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00010000 cdw11:8f8b0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.454106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.448 [2024-07-23 05:07:22.454170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:8f8f8f8f cdw11:8f8f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.454187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.448 [2024-07-23 05:07:22.454249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:abab7c7c cdw11:abab0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.454266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.448 [2024-07-23 05:07:22.454330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:abababab cdw11:7c5d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.454347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.448 #35 NEW cov: 11759 ft: 14699 corp: 24/560b lim: 35 exec/s: 35 rss: 70Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:51.448 [2024-07-23 05:07:22.513836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c780001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.513867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.448 [2024-07-23 05:07:22.513935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.513953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.448 [2024-07-23 05:07:22.514016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.448 [2024-07-23 05:07:22.514034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.707 #36 NEW cov: 11759 ft: 14705 corp: 25/581b lim: 35 exec/s: 36 rss: 70Mb L: 21/35 MS: 1 CrossOver- 00:07:51.707 [2024-07-23 05:07:22.573614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.707 [2024-07-23 05:07:22.573645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.708 #37 NEW cov: 11759 ft: 14715 corp: 26/594b lim: 35 exec/s: 37 rss: 70Mb L: 13/35 MS: 1 InsertByte- 00:07:51.708 [2024-07-23 05:07:22.614065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.708 [2024-07-23 05:07:22.614097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.708 [2024-07-23 05:07:22.614163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00005c7c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.708 [2024-07-23 05:07:22.614181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.708 [2024-07-23 05:07:22.614245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00090000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.708 [2024-07-23 05:07:22.614262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.708 #38 NEW cov: 11759 ft: 14727 corp: 27/617b lim: 35 exec/s: 38 rss: 70Mb L: 23/35 MS: 1 ChangeBinInt- 00:07:51.708 [2024-07-23 05:07:22.673851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c237c cdw11:7c7c0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.708 [2024-07-23 05:07:22.673882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.708 #39 NEW cov: 11759 ft: 14747 corp: 28/629b lim: 35 exec/s: 39 rss: 70Mb L: 12/35 MS: 1 ChangeByte- 00:07:51.708 [2024-07-23 05:07:22.714028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c237c cdw11:7c100002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.708 [2024-07-23 05:07:22.714060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.708 #40 NEW cov: 11759 ft: 14798 corp: 29/642b lim: 35 exec/s: 40 rss: 70Mb L: 13/35 MS: 1 
InsertByte- 00:07:51.708 [2024-07-23 05:07:22.764722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00007c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.708 [2024-07-23 05:07:22.764754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.708 [2024-07-23 05:07:22.764822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.708 [2024-07-23 05:07:22.764840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.708 [2024-07-23 05:07:22.764901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.708 [2024-07-23 05:07:22.764919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.708 [2024-07-23 05:07:22.764981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.708 [2024-07-23 05:07:22.764999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.708 #41 NEW cov: 11759 ft: 14810 corp: 30/673b lim: 35 exec/s: 41 rss: 70Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:07:51.968 [2024-07-23 05:07:22.814837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7c7c0a7c cdw11:7c7c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.968 [2024-07-23 05:07:22.814869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.968 [2024-07-23 05:07:22.814933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:8f8f8b8f cdw11:8f8f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.968 [2024-07-23 05:07:22.814952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.968 [2024-07-23 05:07:22.815014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7c7c8f8f cdw11:abab0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.968 [2024-07-23 05:07:22.815031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.968 [2024-07-23 05:07:22.815092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:abababab cdw11:abab0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.968 [2024-07-23 05:07:22.815110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.968 #42 NEW cov: 11759 ft: 14833 corp: 31/703b lim: 35 exec/s: 21 rss: 70Mb L: 30/35 MS: 1 EraseBytes- 00:07:51.968 #42 DONE cov: 11759 ft: 14833 corp: 31/703b lim: 35 exec/s: 21 rss: 70Mb 00:07:51.968 ###### Recommended dictionary. ###### 00:07:51.968 "?\000\000\000\000\000\000\000" # Uses: 1 00:07:51.968 "\000\000\000\001" # Uses: 0 00:07:51.968 ###### End of recommended dictionary. 
###### 00:07:51.968 Done 42 runs in 2 second(s) 00:07:51.968 05:07:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:51.968 05:07:22 -- ../common.sh@72 -- # (( i++ )) 00:07:51.968 05:07:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.968 05:07:22 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:51.968 05:07:22 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:51.968 05:07:22 -- nvmf/run.sh@24 -- # local timen=1 00:07:51.968 05:07:22 -- nvmf/run.sh@25 -- # local core=0x1 00:07:51.968 05:07:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:51.968 05:07:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:51.968 05:07:22 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:51.968 05:07:22 -- nvmf/run.sh@29 -- # port=4405 00:07:51.968 05:07:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:51.968 05:07:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:51.968 05:07:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:51.968 05:07:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:51.968 [2024-07-23 05:07:23.034265] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:51.968 [2024-07-23 05:07:23.034359] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3140993 ] 00:07:52.227 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.227 [2024-07-23 05:07:23.255008] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.487 [2024-07-23 05:07:23.331256] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.487 [2024-07-23 05:07:23.331438] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.487 [2024-07-23 05:07:23.392587] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.487 [2024-07-23 05:07:23.408950] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:52.487 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.487 INFO: Seed: 2631899641 00:07:52.487 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:52.487 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:52.487 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:52.487 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.487 #2 INITED exec/s: 0 rss: 60Mb 00:07:52.487 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:52.487 This may also happen if the target rejected all inputs we tried so far 00:07:52.487 [2024-07-23 05:07:23.464889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.487 [2024-07-23 05:07:23.464925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.487 [2024-07-23 05:07:23.464989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.487 [2024-07-23 05:07:23.465007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.487 [2024-07-23 05:07:23.465070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.487 [2024-07-23 05:07:23.465087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.487 [2024-07-23 05:07:23.465149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.487 [2024-07-23 05:07:23.465167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.427 NEW_FUNC[1/671]: 0x488f80 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:53.427 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.427 #7 NEW cov: 11543 ft: 11534 corp: 2/37b lim: 45 exec/s: 0 rss: 67Mb L: 36/36 MS: 5 CopyPart-InsertByte-CrossOver-EraseBytes-InsertRepeatedBytes- 00:07:53.427 [2024-07-23 05:07:24.427324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d4b1d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.427 [2024-07-23 05:07:24.427366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.427 [2024-07-23 05:07:24.427447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.427 [2024-07-23 05:07:24.427467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.427 [2024-07-23 05:07:24.427537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.427 [2024-07-23 05:07:24.427554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.427 #10 NEW cov: 11656 ft: 12326 corp: 3/68b lim: 45 exec/s: 10 rss: 67Mb L: 31/36 MS: 3 InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:53.427 [2024-07-23 05:07:24.477563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.427 [2024-07-23 
05:07:24.477596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.427 [2024-07-23 05:07:24.477664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.427 [2024-07-23 05:07:24.477683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.427 [2024-07-23 05:07:24.477750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.427 [2024-07-23 05:07:24.477769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.427 [2024-07-23 05:07:24.477834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.427 [2024-07-23 05:07:24.477852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.427 #11 NEW cov: 11662 ft: 12647 corp: 4/110b lim: 45 exec/s: 11 rss: 67Mb L: 42/42 MS: 1 CopyPart- 00:07:53.706 [2024-07-23 05:07:24.527721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.706 [2024-07-23 05:07:24.527754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.706 [2024-07-23 05:07:24.527823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.706 [2024-07-23 05:07:24.527842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.706 [2024-07-23 05:07:24.527908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.706 [2024-07-23 05:07:24.527927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.706 [2024-07-23 05:07:24.527995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.706 [2024-07-23 05:07:24.528013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.706 #12 NEW cov: 11747 ft: 12898 corp: 5/152b lim: 45 exec/s: 12 rss: 67Mb L: 42/42 MS: 1 ShuffleBytes- 00:07:53.706 [2024-07-23 05:07:24.587879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.706 [2024-07-23 05:07:24.587915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.587988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 
05:07:24.588008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.588074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.588093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.588158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.588176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.707 #13 NEW cov: 11747 ft: 13034 corp: 6/194b lim: 45 exec/s: 13 rss: 67Mb L: 42/42 MS: 1 CopyPart- 00:07:53.707 [2024-07-23 05:07:24.648029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.648064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.648129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.648149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.648215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:2acd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.648234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.648296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.648314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.707 #14 NEW cov: 11747 ft: 13118 corp: 7/236b lim: 45 exec/s: 14 rss: 67Mb L: 42/42 MS: 1 ChangeByte- 00:07:53.707 [2024-07-23 05:07:24.698191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.698224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.698295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.698314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.698380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcd4dcd cdw11:2acd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 
05:07:24.698399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.698461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.698492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.707 #15 NEW cov: 11747 ft: 13198 corp: 8/278b lim: 45 exec/s: 15 rss: 68Mb L: 42/42 MS: 1 ChangeBit- 00:07:53.707 [2024-07-23 05:07:24.758409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.758448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.758516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.758535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.758604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcd4d cdw11:cd2a0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.758623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.707 [2024-07-23 05:07:24.758690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.707 [2024-07-23 05:07:24.758709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.707 #16 NEW cov: 11747 ft: 13235 corp: 9/321b lim: 45 exec/s: 16 rss: 68Mb L: 43/43 MS: 1 InsertByte- 00:07:53.966 [2024-07-23 05:07:24.808497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.808531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:24.808599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.808618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:24.808686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:2acd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.808704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:24.808771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdce0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 
05:07:24.808788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.966 #17 NEW cov: 11747 ft: 13270 corp: 10/363b lim: 45 exec/s: 17 rss: 68Mb L: 42/43 MS: 1 ChangeBinInt- 00:07:53.966 [2024-07-23 05:07:24.858302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.858334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:24.858400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.858420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.966 #20 NEW cov: 11747 ft: 13571 corp: 11/384b lim: 45 exec/s: 20 rss: 68Mb L: 21/43 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:53.966 [2024-07-23 05:07:24.908823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.908855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:24.908928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.908948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:24.909015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:2acd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.909033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:24.909100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.909118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.966 #21 NEW cov: 11747 ft: 13592 corp: 12/426b lim: 45 exec/s: 21 rss: 68Mb L: 42/43 MS: 1 ShuffleBytes- 00:07:53.966 [2024-07-23 05:07:24.948749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.948780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:24.948851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.948869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:24.948940] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:24.948958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.966 #22 NEW cov: 11747 ft: 13620 corp: 13/457b lim: 45 exec/s: 22 rss: 68Mb L: 31/43 MS: 1 InsertRepeatedBytes- 00:07:53.966 [2024-07-23 05:07:25.009078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:25.009111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:25.009178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:25.009198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.966 [2024-07-23 05:07:25.009264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.966 [2024-07-23 05:07:25.009283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.967 [2024-07-23 05:07:25.009349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.967 [2024-07-23 05:07:25.009368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.967 #23 NEW cov: 11747 ft: 13650 corp: 14/499b lim: 45 exec/s: 23 rss: 68Mb L: 42/43 MS: 1 CopyPart- 00:07:54.226 [2024-07-23 05:07:25.069240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.069273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.069344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.069364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.069430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:2acd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.069454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.069521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.069539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.226 #24 NEW cov: 11747 ft: 13672 corp: 
15/541b lim: 45 exec/s: 24 rss: 68Mb L: 42/43 MS: 1 CrossOver- 00:07:54.226 [2024-07-23 05:07:25.119410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.119446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.119517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cd35cdcd cdw11:32320001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.119536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.119607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.119625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.119694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.119713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.226 #25 NEW cov: 11747 ft: 13725 corp: 16/577b lim: 45 exec/s: 25 rss: 68Mb L: 36/43 MS: 1 ChangeBinInt- 00:07:54.226 [2024-07-23 05:07:25.169535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.169568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.169639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.169658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.169722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcd4d cdw11:cd2a0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.169741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.169803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.169822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.226 #26 NEW cov: 11747 ft: 13749 corp: 17/620b lim: 45 exec/s: 26 rss: 68Mb L: 43/43 MS: 1 ShuffleBytes- 00:07:54.226 [2024-07-23 05:07:25.219674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.219705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.219772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.219792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.219860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcd4dcd cdw11:2acd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.219878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.219939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.219957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.226 #27 NEW cov: 11747 ft: 13756 corp: 18/662b lim: 45 exec/s: 27 rss: 68Mb L: 42/43 MS: 1 CrossOver- 00:07:54.226 [2024-07-23 05:07:25.259481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.259515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.226 [2024-07-23 05:07:25.259583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.259602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.226 #28 NEW cov: 11747 ft: 13780 corp: 19/684b lim: 45 exec/s: 28 rss: 69Mb L: 22/43 MS: 1 InsertByte- 00:07:54.226 [2024-07-23 05:07:25.309419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.226 [2024-07-23 05:07:25.309456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.485 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.485 #29 NEW cov: 11770 ft: 14570 corp: 20/700b lim: 45 exec/s: 29 rss: 69Mb L: 16/43 MS: 1 EraseBytes- 00:07:54.485 [2024-07-23 05:07:25.359946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.485 [2024-07-23 05:07:25.359977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.485 [2024-07-23 05:07:25.360048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cd35cdcd cdw11:32320001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.485 [2024-07-23 05:07:25.360068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.485 [2024-07-23 05:07:25.360136] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.485 [2024-07-23 05:07:25.360156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.485 #30 NEW cov: 11770 ft: 14602 corp: 21/733b lim: 45 exec/s: 30 rss: 69Mb L: 33/43 MS: 1 CrossOver- 00:07:54.485 [2024-07-23 05:07:25.420150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.485 [2024-07-23 05:07:25.420186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.485 [2024-07-23 05:07:25.420254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:cd2acdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.485 [2024-07-23 05:07:25.420273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.485 [2024-07-23 05:07:25.420341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:cdcdcdcd cdw11:cdcd0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.485 [2024-07-23 05:07:25.420358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.485 #31 NEW cov: 11770 ft: 14632 corp: 22/765b lim: 45 exec/s: 15 rss: 69Mb L: 32/43 MS: 1 EraseBytes- 00:07:54.485 #31 DONE cov: 11770 ft: 14632 corp: 22/765b lim: 45 exec/s: 15 rss: 69Mb 00:07:54.485 Done 31 runs in 2 second(s) 00:07:54.745 05:07:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:54.745 05:07:25 -- ../common.sh@72 -- # (( i++ )) 00:07:54.745 05:07:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.745 05:07:25 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:54.745 05:07:25 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:54.745 05:07:25 -- nvmf/run.sh@24 -- # local timen=1 00:07:54.745 05:07:25 -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.745 05:07:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:54.745 05:07:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:54.745 05:07:25 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:54.745 05:07:25 -- nvmf/run.sh@29 -- # port=4406 00:07:54.745 05:07:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:54.745 05:07:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:54.745 05:07:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.745 05:07:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:54.745 [2024-07-23 05:07:25.637524] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 
23.11.0 initialization... 00:07:54.745 [2024-07-23 05:07:25.637602] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3141532 ] 00:07:54.745 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.056 [2024-07-23 05:07:25.848595] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.056 [2024-07-23 05:07:25.923788] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.056 [2024-07-23 05:07:25.923963] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.056 [2024-07-23 05:07:25.984936] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.056 [2024-07-23 05:07:26.001300] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:55.056 INFO: Running with entropic power schedule (0xFF, 100). 00:07:55.056 INFO: Seed: 927930715 00:07:55.056 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:55.056 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:55.056 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:55.056 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.056 #2 INITED exec/s: 0 rss: 60Mb 00:07:55.056 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:55.056 This may also happen if the target rejected all inputs we tried so far 00:07:55.056 [2024-07-23 05:07:26.056661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a62 cdw11:00000000 00:07:55.056 [2024-07-23 05:07:26.056698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.625 NEW_FUNC[1/669]: 0x48b790 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:55.625 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.625 #3 NEW cov: 11460 ft: 11461 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 InsertByte- 00:07:55.625 [2024-07-23 05:07:26.498336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a62 cdw11:00000000 00:07:55.625 [2024-07-23 05:07:26.498378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.625 [2024-07-23 05:07:26.498439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d4bc cdw11:00000000 00:07:55.625 [2024-07-23 05:07:26.498463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.625 [2024-07-23 05:07:26.498537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f6f2 cdw11:00000000 00:07:55.625 [2024-07-23 05:07:26.498555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.625 [2024-07-23 05:07:26.498621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:55.625 [2024-07-23 05:07:26.498638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.625 [2024-07-23 05:07:26.498700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003000 cdw11:00000000 00:07:55.625 [2024-07-23 05:07:26.498717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.625 #4 NEW cov: 11573 ft: 12155 corp: 3/13b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 CMP- DE: "\324\274\366\362E\3430\000"- 00:07:55.625 [2024-07-23 05:07:26.557832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f662 cdw11:00000000 00:07:55.625 [2024-07-23 05:07:26.557864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.625 #5 NEW cov: 11579 ft: 12533 corp: 4/15b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 CrossOver- 00:07:55.626 [2024-07-23 05:07:26.608475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a33 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.608510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.608575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005e68 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.608593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.608656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c045 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.608673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.608733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e330 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.608751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.608819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000062 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.608837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.626 #6 NEW cov: 11664 ft: 12874 corp: 5/25b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 CMP- DE: "3^h\300E\3430\000"- 00:07:55.626 [2024-07-23 05:07:26.658620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a33 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.658652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.658716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.658733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.658792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003045 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.658809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.658869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e330 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.658886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.658945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000062 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.658962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.626 #7 NEW cov: 11664 ft: 12970 corp: 6/35b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 CrossOver- 00:07:55.626 [2024-07-23 05:07:26.708779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.708811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.708873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003345 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.708890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.708946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e345 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.708963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.709024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e330 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.709042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.626 [2024-07-23 05:07:26.709101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000062 cdw11:00000000 00:07:55.626 [2024-07-23 05:07:26.709119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.886 #8 NEW cov: 11664 ft: 13115 corp: 7/45b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 CrossOver- 00:07:55.886 [2024-07-23 05:07:26.758433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f662 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.758469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.886 #9 NEW cov: 11664 ft: 13177 corp: 8/47b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 CopyPart- 00:07:55.886 [2024-07-23 05:07:26.809028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:55.886 [2024-07-23 
05:07:26.809058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.809120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003345 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.809138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.809194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c345 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.809211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.809272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e330 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.809289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.809347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000062 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.809364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.886 #10 NEW cov: 11664 ft: 13263 corp: 9/57b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:07:55.886 [2024-07-23 05:07:26.858716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f6f6 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.858747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.886 #11 NEW cov: 11664 ft: 13319 corp: 10/59b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 CopyPart- 00:07:55.886 [2024-07-23 05:07:26.909381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000bc45 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.909412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.909482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000f6f2 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.909501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.909560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d40a cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.909577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.909637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e362 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.909655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.909712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003000 cdw11:00000000 00:07:55.886 [2024-07-23 
05:07:26.909730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.886 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:55.886 #12 NEW cov: 11687 ft: 13368 corp: 11/69b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:55.886 [2024-07-23 05:07:26.969550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000bc45 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.969580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.969645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000f645 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.969662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.969722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d40a cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.969739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.969800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e362 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.969819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.886 [2024-07-23 05:07:26.969880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003000 cdw11:00000000 00:07:55.886 [2024-07-23 05:07:26.969898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.146 #13 NEW cov: 11687 ft: 13409 corp: 12/79b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CrossOver- 00:07:56.146 [2024-07-23 05:07:27.019709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.019739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.019799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003345 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.019816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.019880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e345 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.019897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.019959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e330 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.019976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.020034] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000009a cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.020051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.146 #14 NEW cov: 11687 ft: 13459 corp: 13/89b lim: 10 exec/s: 14 rss: 68Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:56.146 [2024-07-23 05:07:27.059718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000335e cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.059749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.059810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000068c0 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.059828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.059890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.059907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.059969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00003000 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.059990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.146 #17 NEW cov: 11687 ft: 13470 corp: 14/98b lim: 10 exec/s: 17 rss: 68Mb L: 9/10 MS: 3 ChangeByte-ChangeBit-PersAutoDict- DE: "3^h\300E\3430\000"- 00:07:56.146 [2024-07-23 05:07:27.109669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a33 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.109700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.109765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.109782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.109847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003000 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.109864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.146 #18 NEW cov: 11687 ft: 13646 corp: 15/105b lim: 10 exec/s: 18 rss: 68Mb L: 7/10 MS: 1 EraseBytes- 00:07:56.146 [2024-07-23 05:07:27.160112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d4bc cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.160143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.160207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000f6f2 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.160226] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.160284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.160302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.160365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00003000 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.160383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.160446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000009a cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.160464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.146 #19 NEW cov: 11687 ft: 13657 corp: 16/115b lim: 10 exec/s: 19 rss: 68Mb L: 10/10 MS: 1 PersAutoDict- DE: "\324\274\366\362E\3430\000"- 00:07:56.146 [2024-07-23 05:07:27.209981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.210012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.210075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003345 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.210093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.146 [2024-07-23 05:07:27.210155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e345 cdw11:00000000 00:07:56.146 [2024-07-23 05:07:27.210172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.146 #20 NEW cov: 11687 ft: 13739 corp: 17/122b lim: 10 exec/s: 20 rss: 68Mb L: 7/10 MS: 1 CrossOver- 00:07:56.406 [2024-07-23 05:07:27.250341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001962 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.250377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.250446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d4bc cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.250465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.250529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f6f2 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.250545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.250609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:56.406 
[2024-07-23 05:07:27.250629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.250691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003000 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.250708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.406 #21 NEW cov: 11687 ft: 13750 corp: 18/132b lim: 10 exec/s: 21 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:07:56.406 [2024-07-23 05:07:27.289890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f662 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.289921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.406 #22 NEW cov: 11687 ft: 13780 corp: 19/134b lim: 10 exec/s: 22 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:56.406 [2024-07-23 05:07:27.330619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001962 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.330650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.330712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d4b4 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.330730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.330791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f6f2 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.330809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.330868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.330885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.330946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003000 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.330963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.406 #23 NEW cov: 11687 ft: 13794 corp: 20/144b lim: 10 exec/s: 23 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:07:56.406 [2024-07-23 05:07:27.380756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000bc45 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.380788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.380851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000f6f2 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.380872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.380931] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d4bc cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.380949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.381009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000045f6 cdw11:00000000 00:07:56.406 [2024-07-23 05:07:27.381027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.406 [2024-07-23 05:07:27.381087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003000 cdw11:00000000 00:07:56.407 [2024-07-23 05:07:27.381104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.407 #24 NEW cov: 11687 ft: 13866 corp: 21/154b lim: 10 exec/s: 24 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:56.407 [2024-07-23 05:07:27.420815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000bc45 cdw11:00000000 00:07:56.407 [2024-07-23 05:07:27.420846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.407 [2024-07-23 05:07:27.420908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c345 cdw11:00000000 00:07:56.407 [2024-07-23 05:07:27.420925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.407 [2024-07-23 05:07:27.420988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e330 cdw11:00000000 00:07:56.407 [2024-07-23 05:07:27.421005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.407 [2024-07-23 05:07:27.421065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000062 cdw11:00000000 00:07:56.407 [2024-07-23 05:07:27.421083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.407 [2024-07-23 05:07:27.421143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003000 cdw11:00000000 00:07:56.407 [2024-07-23 05:07:27.421160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.407 #25 NEW cov: 11687 ft: 13879 corp: 22/164b lim: 10 exec/s: 25 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:07:56.407 [2024-07-23 05:07:27.470497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f662 cdw11:00000000 00:07:56.407 [2024-07-23 05:07:27.470527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.407 #26 NEW cov: 11687 ft: 13922 corp: 23/166b lim: 10 exec/s: 26 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:56.667 [2024-07-23 05:07:27.510746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f6f6 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.510776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.510840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006262 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.510857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.667 #27 NEW cov: 11687 ft: 14064 corp: 24/170b lim: 10 exec/s: 27 rss: 69Mb L: 4/10 MS: 1 CopyPart- 00:07:56.667 [2024-07-23 05:07:27.550737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.550770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.667 #28 NEW cov: 11687 ft: 14069 corp: 25/172b lim: 10 exec/s: 28 rss: 69Mb L: 2/10 MS: 1 CrossOver- 00:07:56.667 [2024-07-23 05:07:27.591351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001962 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.591382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.591449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d4bc cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.591467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.591531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f6f2 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.591549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.591612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.591629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.591692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.591709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.667 #29 NEW cov: 11687 ft: 14081 corp: 26/182b lim: 10 exec/s: 29 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:56.667 [2024-07-23 05:07:27.631500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001962 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.631531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.631594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d419 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.631612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.631675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000062f2 cdw11:00000000 
00:07:56.667 [2024-07-23 05:07:27.631693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.631754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.631771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.631833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.631849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.667 #30 NEW cov: 11687 ft: 14085 corp: 27/192b lim: 10 exec/s: 30 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:56.667 [2024-07-23 05:07:27.681662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.681693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.681755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003330 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.681773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.681837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000045 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.681854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.681915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e3e3 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.681933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.681990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000459a cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.682007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.667 #31 NEW cov: 11687 ft: 14101 corp: 28/202b lim: 10 exec/s: 31 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:56.667 [2024-07-23 05:07:27.721753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001962 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.721785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.721848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000bcd4 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.721866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.721927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000045 cdw11:00000000 
00:07:56.667 [2024-07-23 05:07:27.721945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.722003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000f2e3 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.722021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.667 [2024-07-23 05:07:27.722079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000f600 cdw11:00000000 00:07:56.667 [2024-07-23 05:07:27.722096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.667 #32 NEW cov: 11687 ft: 14128 corp: 29/212b lim: 10 exec/s: 32 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:56.927 [2024-07-23 05:07:27.761431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f6cc cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.761470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.761531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006262 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.761548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.927 #33 NEW cov: 11687 ft: 14142 corp: 30/216b lim: 10 exec/s: 33 rss: 69Mb L: 4/10 MS: 1 ChangeByte- 00:07:56.927 [2024-07-23 05:07:27.811962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a33 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.811993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.812055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005e68 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.812073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.812134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c045 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.812154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.812215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e331 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.812233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.812287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000062 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.812304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.927 #34 NEW cov: 11687 ft: 14146 corp: 31/226b lim: 10 exec/s: 34 rss: 69Mb L: 10/10 MS: 1 ChangeASCIIInt- 00:07:56.927 [2024-07-23 05:07:27.851649] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f662 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.851680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.927 #35 NEW cov: 11687 ft: 14154 corp: 32/228b lim: 10 exec/s: 35 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:56.927 [2024-07-23 05:07:27.902137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000abc cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.902167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.902229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000045f6 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.902248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.902306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f2d4 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.902323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.902383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000bc45 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.902399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.927 #36 NEW cov: 11687 ft: 14161 corp: 33/237b lim: 10 exec/s: 36 rss: 69Mb L: 9/10 MS: 1 CrossOver- 00:07:56.927 [2024-07-23 05:07:27.942186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f645 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.942217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.942280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e300 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.942298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.942359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000000cc cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.942376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.942437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006262 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.942458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.927 #37 NEW cov: 11687 ft: 14211 corp: 34/245b lim: 10 exec/s: 37 rss: 69Mb L: 8/10 MS: 1 CrossOver- 00:07:56.927 [2024-07-23 05:07:27.992298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a33 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.992329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.992393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.992411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.927 [2024-07-23 05:07:27.992479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003000 cdw11:00000000 00:07:56.927 [2024-07-23 05:07:27.992497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.278 #38 NEW cov: 11687 ft: 14212 corp: 35/252b lim: 10 exec/s: 38 rss: 69Mb L: 7/10 MS: 1 ShuffleBytes- 00:07:57.278 [2024-07-23 05:07:28.042483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001962 cdw11:00000000 00:07:57.278 [2024-07-23 05:07:28.042514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.278 [2024-07-23 05:07:28.042576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000045e3 cdw11:00000000 00:07:57.278 [2024-07-23 05:07:28.042595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.278 [2024-07-23 05:07:28.042657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.278 [2024-07-23 05:07:28.042674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.278 #39 NEW cov: 11687 ft: 14217 corp: 36/258b lim: 10 exec/s: 19 rss: 70Mb L: 6/10 MS: 1 EraseBytes- 00:07:57.278 #39 DONE cov: 11687 ft: 14217 corp: 36/258b lim: 10 exec/s: 19 rss: 70Mb 00:07:57.278 ###### Recommended dictionary. ###### 00:07:57.278 "\324\274\366\362E\3430\000" # Uses: 1 00:07:57.278 "3^h\300E\3430\000" # Uses: 1 00:07:57.278 ###### End of recommended dictionary. 
###### 00:07:57.278 Done 39 runs in 2 second(s) 00:07:57.278 05:07:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:57.278 05:07:28 -- ../common.sh@72 -- # (( i++ )) 00:07:57.278 05:07:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.278 05:07:28 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:57.278 05:07:28 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:57.278 05:07:28 -- nvmf/run.sh@24 -- # local timen=1 00:07:57.278 05:07:28 -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.278 05:07:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:57.278 05:07:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:57.278 05:07:28 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:57.278 05:07:28 -- nvmf/run.sh@29 -- # port=4407 00:07:57.278 05:07:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:57.278 05:07:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:57.278 05:07:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.278 05:07:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:57.278 [2024-07-23 05:07:28.250454] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:57.278 [2024-07-23 05:07:28.250534] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3142063 ] 00:07:57.278 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.537 [2024-07-23 05:07:28.472249] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.537 [2024-07-23 05:07:28.549983] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:57.537 [2024-07-23 05:07:28.550161] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.537 [2024-07-23 05:07:28.611264] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.537 [2024-07-23 05:07:28.627622] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:57.796 INFO: Running with entropic power schedule (0xFF, 100). 00:07:57.796 INFO: Seed: 3554930961 00:07:57.797 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:57.797 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:57.797 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:57.797 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.797 #2 INITED exec/s: 0 rss: 60Mb 00:07:57.797 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:57.797 This may also happen if the target rejected all inputs we tried so far 00:07:57.797 [2024-07-23 05:07:28.683040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003600 cdw11:00000000 00:07:57.797 [2024-07-23 05:07:28.683077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.056 NEW_FUNC[1/669]: 0x48c180 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:58.056 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.056 #4 NEW cov: 11454 ft: 11455 corp: 2/4b lim: 10 exec/s: 0 rss: 67Mb L: 3/3 MS: 2 ChangeBit-CMP- DE: "6\000"- 00:07:58.056 [2024-07-23 05:07:29.124134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003600 cdw11:00000000 00:07:58.056 [2024-07-23 05:07:29.124175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.316 #5 NEW cov: 11573 ft: 12076 corp: 3/7b lim: 10 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeByte- 00:07:58.316 [2024-07-23 05:07:29.184180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.316 [2024-07-23 05:07:29.184213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.316 #6 NEW cov: 11579 ft: 12289 corp: 4/9b lim: 10 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 CrossOver- 00:07:58.316 [2024-07-23 05:07:29.224259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.316 [2024-07-23 05:07:29.224291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.316 #7 NEW cov: 11664 ft: 12603 corp: 5/11b lim: 10 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 CrossOver- 00:07:58.316 [2024-07-23 05:07:29.274460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001600 cdw11:00000000 00:07:58.316 [2024-07-23 05:07:29.274493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.316 #8 NEW cov: 11664 ft: 12763 corp: 6/14b lim: 10 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeBit- 00:07:58.316 [2024-07-23 05:07:29.324553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001700 cdw11:00000000 00:07:58.316 [2024-07-23 05:07:29.324586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.316 #9 NEW cov: 11664 ft: 12801 corp: 7/17b lim: 10 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeBit- 00:07:58.316 [2024-07-23 05:07:29.374971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003600 cdw11:00000000 00:07:58.316 [2024-07-23 05:07:29.375007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.316 [2024-07-23 05:07:29.375072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 
cdw10:00002601 cdw11:00000000 00:07:58.316 [2024-07-23 05:07:29.375090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.316 [2024-07-23 05:07:29.375152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.316 [2024-07-23 05:07:29.375170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.316 #10 NEW cov: 11664 ft: 13047 corp: 8/24b lim: 10 exec/s: 0 rss: 67Mb L: 7/7 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:58.576 [2024-07-23 05:07:29.424883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000002a cdw11:00000000 00:07:58.576 [2024-07-23 05:07:29.424915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.576 #11 NEW cov: 11664 ft: 13069 corp: 9/26b lim: 10 exec/s: 0 rss: 68Mb L: 2/7 MS: 1 CrossOver- 00:07:58.576 [2024-07-23 05:07:29.464971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a02 cdw11:00000000 00:07:58.576 [2024-07-23 05:07:29.465003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.576 #12 NEW cov: 11664 ft: 13100 corp: 10/28b lim: 10 exec/s: 0 rss: 68Mb L: 2/7 MS: 1 ChangeBit- 00:07:58.576 [2024-07-23 05:07:29.505084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.576 [2024-07-23 05:07:29.505115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.576 #15 NEW cov: 11664 ft: 13159 corp: 11/30b lim: 10 exec/s: 0 rss: 68Mb L: 2/7 MS: 3 CopyPart-ShuffleBytes-CopyPart- 00:07:58.576 [2024-07-23 05:07:29.545361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000366f cdw11:00000000 00:07:58.576 [2024-07-23 05:07:29.545394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.576 [2024-07-23 05:07:29.545458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002a cdw11:00000000 00:07:58.576 [2024-07-23 05:07:29.545477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.576 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.576 #16 NEW cov: 11687 ft: 13357 corp: 12/34b lim: 10 exec/s: 0 rss: 68Mb L: 4/7 MS: 1 InsertByte- 00:07:58.576 [2024-07-23 05:07:29.595584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003600 cdw11:00000000 00:07:58.576 [2024-07-23 05:07:29.595617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.576 [2024-07-23 05:07:29.595678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002a3f cdw11:00000000 00:07:58.576 [2024-07-23 05:07:29.595696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:58.576 [2024-07-23 05:07:29.595756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003f3f cdw11:00000000 00:07:58.576 [2024-07-23 05:07:29.595773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.576 #17 NEW cov: 11687 ft: 13401 corp: 13/41b lim: 10 exec/s: 0 rss: 68Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:07:58.576 [2024-07-23 05:07:29.635518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000036 cdw11:00000000 00:07:58.576 [2024-07-23 05:07:29.635554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.576 #18 NEW cov: 11687 ft: 13506 corp: 14/44b lim: 10 exec/s: 0 rss: 68Mb L: 3/7 MS: 1 ShuffleBytes- 00:07:58.836 [2024-07-23 05:07:29.675610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001600 cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.675642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.836 #19 NEW cov: 11687 ft: 13551 corp: 15/47b lim: 10 exec/s: 19 rss: 68Mb L: 3/7 MS: 1 ChangeBit- 00:07:58.836 [2024-07-23 05:07:29.715701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001600 cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.715733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.836 #20 NEW cov: 11687 ft: 13554 corp: 16/50b lim: 10 exec/s: 20 rss: 68Mb L: 3/7 MS: 1 ChangeByte- 00:07:58.836 [2024-07-23 05:07:29.756123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.756155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.836 [2024-07-23 05:07:29.756216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff0d cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.756234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.836 [2024-07-23 05:07:29.756294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.756313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.836 #21 NEW cov: 11687 ft: 13676 corp: 17/56b lim: 10 exec/s: 21 rss: 68Mb L: 6/7 MS: 1 CMP- DE: "\377\377\377\015"- 00:07:58.836 [2024-07-23 05:07:29.806419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.806456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.836 [2024-07-23 05:07:29.806518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.806536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:58.836 [2024-07-23 05:07:29.806597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.806616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.836 [2024-07-23 05:07:29.806676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.806693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.836 #22 NEW cov: 11687 ft: 13891 corp: 18/64b lim: 10 exec/s: 22 rss: 68Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:58.836 [2024-07-23 05:07:29.856097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000340a cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.856128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.836 #23 NEW cov: 11687 ft: 13903 corp: 19/66b lim: 10 exec/s: 23 rss: 68Mb L: 2/8 MS: 1 InsertByte- 00:07:58.836 [2024-07-23 05:07:29.896262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003600 cdw11:00000000 00:07:58.836 [2024-07-23 05:07:29.896298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.836 #24 NEW cov: 11687 ft: 13913 corp: 20/69b lim: 10 exec/s: 24 rss: 68Mb L: 3/8 MS: 1 ChangeByte- 00:07:59.096 [2024-07-23 05:07:29.936734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008686 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:29.936766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:29.936827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008686 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:29.936845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:29.936903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008686 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:29.936921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:29.936979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00008624 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:29.936996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.096 #28 NEW cov: 11687 ft: 13945 corp: 21/77b lim: 10 exec/s: 28 rss: 68Mb L: 8/8 MS: 4 EraseBytes-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:07:59.096 [2024-07-23 05:07:29.986479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000036 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:29.986510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.096 #29 NEW cov: 11687 ft: 13963 
corp: 22/80b lim: 10 exec/s: 29 rss: 68Mb L: 3/8 MS: 1 ShuffleBytes- 00:07:59.096 [2024-07-23 05:07:30.027018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.027050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.027112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.027129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.027187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000db2d cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.027204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.027263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000f50a cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.027281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.096 #30 NEW cov: 11687 ft: 13973 corp: 23/88b lim: 10 exec/s: 30 rss: 68Mb L: 8/8 MS: 1 ChangeBinInt- 00:07:59.096 [2024-07-23 05:07:30.077278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000340a cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.077310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.077379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000030 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.077397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.077457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e34c cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.077478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.077537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000c883 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.077553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.077611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000fd80 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.077628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.096 #31 NEW cov: 11687 ft: 14014 corp: 24/98b lim: 10 exec/s: 31 rss: 68Mb L: 10/10 MS: 1 CMP- DE: "\0000\343L\310\203\375\200"- 00:07:59.096 [2024-07-23 05:07:30.117076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003600 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.117108] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.117169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002601 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.117187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.117251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.117268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.096 #32 NEW cov: 11687 ft: 14029 corp: 25/105b lim: 10 exec/s: 32 rss: 68Mb L: 7/10 MS: 1 ShuffleBytes- 00:07:59.096 [2024-07-23 05:07:30.167090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.167121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.096 [2024-07-23 05:07:30.167182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:59.096 [2024-07-23 05:07:30.167200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.356 #33 NEW cov: 11687 ft: 14042 corp: 26/109b lim: 10 exec/s: 33 rss: 68Mb L: 4/10 MS: 1 EraseBytes- 00:07:59.356 [2024-07-23 05:07:30.227520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.227552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.356 [2024-07-23 05:07:30.227623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.227642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.356 [2024-07-23 05:07:30.227701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000db2d cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.227718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.356 [2024-07-23 05:07:30.227776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000f00a cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.227793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.356 #34 NEW cov: 11687 ft: 14082 corp: 27/117b lim: 10 exec/s: 34 rss: 68Mb L: 8/10 MS: 1 ChangeBinInt- 00:07:59.356 [2024-07-23 05:07:30.277287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.277322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.356 #35 NEW cov: 11687 ft: 14092 corp: 28/119b lim: 10 exec/s: 35 rss: 68Mb L: 2/10 MS: 1 CrossOver- 
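Each "#N NEW" line above is a libFuzzer status report: "cov" counts covered code edges, "ft" counts features, "corp" gives corpus entries and total bytes, "lim" the current input-length limit, "exec/s" the execution rate, "rss" resident memory, "L" the new input's length against the largest in the corpus, and "MS" the mutation sequence that produced it (e.g. CrossOver, InsertByte). A minimal sketch for pulling coverage growth out of a saved copy of this console output; the fuzz.log file name is an assumption, nothing in this job writes it:

# Extract (iteration, edge coverage, features) from libFuzzer status lines.
grep -oE '#[0-9]+ (INITED|NEW|DONE) cov: [0-9]+ ft: [0-9]+' fuzz.log \
  | awk '{ sub(/^#/, "", $1); print $1, $4, $6 }'   # iteration, edges, features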
00:07:59.356 [2024-07-23 05:07:30.317833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.317864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.356 [2024-07-23 05:07:30.317922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.317940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.356 [2024-07-23 05:07:30.317998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.318015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.356 [2024-07-23 05:07:30.318071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.318089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.356 #36 NEW cov: 11687 ft: 14139 corp: 29/127b lim: 10 exec/s: 36 rss: 68Mb L: 8/10 MS: 1 ShuffleBytes- 00:07:59.356 [2024-07-23 05:07:30.357525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.357556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.356 #37 NEW cov: 11687 ft: 14142 corp: 30/129b lim: 10 exec/s: 37 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:07:59.356 [2024-07-23 05:07:30.407683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001400 cdw11:00000000 00:07:59.356 [2024-07-23 05:07:30.407715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.356 #38 NEW cov: 11687 ft: 14162 corp: 31/132b lim: 10 exec/s: 38 rss: 69Mb L: 3/10 MS: 1 ChangeBit- 00:07:59.616 [2024-07-23 05:07:30.458220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.458252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.458313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.458331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.458389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000db2d cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.458407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.458481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000f50a cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.458499] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.616 #39 NEW cov: 11687 ft: 14168 corp: 32/140b lim: 10 exec/s: 39 rss: 69Mb L: 8/10 MS: 1 ShuffleBytes- 00:07:59.616 [2024-07-23 05:07:30.498000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.498031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.498097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000d0a cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.498114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.616 #40 NEW cov: 11687 ft: 14170 corp: 33/145b lim: 10 exec/s: 40 rss: 69Mb L: 5/10 MS: 1 EraseBytes- 00:07:59.616 [2024-07-23 05:07:30.538320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bfff cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.538350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.538409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff0d cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.538427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.538489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.538507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.616 #41 NEW cov: 11687 ft: 14187 corp: 34/151b lim: 10 exec/s: 41 rss: 69Mb L: 6/10 MS: 1 ChangeBit- 00:07:59.616 [2024-07-23 05:07:30.578561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0d cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.578591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.578650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000d0d cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.578668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.578727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000d0d cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.578745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.578802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000d0d cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.578819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.616 #42 NEW cov: 11687 ft: 14235 corp: 35/160b lim: 10 exec/s: 42 rss: 69Mb L: 9/10 MS: 1 
InsertRepeatedBytes- 00:07:59.616 [2024-07-23 05:07:30.618547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003600 cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.618577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.618637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002601 cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.618655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.616 [2024-07-23 05:07:30.618716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.618733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.616 #43 NEW cov: 11687 ft: 14249 corp: 36/167b lim: 10 exec/s: 43 rss: 69Mb L: 7/10 MS: 1 ChangeBit- 00:07:59.616 [2024-07-23 05:07:30.668462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000036 cdw11:00000000 00:07:59.616 [2024-07-23 05:07:30.668493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.616 #44 NEW cov: 11687 ft: 14260 corp: 37/169b lim: 10 exec/s: 22 rss: 69Mb L: 2/10 MS: 1 EraseBytes- 00:07:59.616 #44 DONE cov: 11687 ft: 14260 corp: 37/169b lim: 10 exec/s: 22 rss: 69Mb 00:07:59.616 ###### Recommended dictionary. ###### 00:07:59.616 "6\000" # Uses: 0 00:07:59.616 "\001\000\000\000" # Uses: 0 00:07:59.616 "\377\377\377\015" # Uses: 0 00:07:59.616 "\0000\343L\310\203\375\200" # Uses: 0 00:07:59.616 ###### End of recommended dictionary. 
###### 00:07:59.616 Done 44 runs in 2 second(s) 00:07:59.876 05:07:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:59.876 05:07:30 -- ../common.sh@72 -- # (( i++ )) 00:07:59.876 05:07:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.876 05:07:30 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:59.876 05:07:30 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:59.876 05:07:30 -- nvmf/run.sh@24 -- # local timen=1 00:07:59.876 05:07:30 -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.876 05:07:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:59.876 05:07:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:59.876 05:07:30 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:59.876 05:07:30 -- nvmf/run.sh@29 -- # port=4408 00:07:59.876 05:07:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:59.876 05:07:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:59.876 05:07:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.876 05:07:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:59.876 [2024-07-23 05:07:30.884820] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:59.876 [2024-07-23 05:07:30.884892] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3142382 ] 00:07:59.876 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.135 [2024-07-23 05:07:31.111236] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.135 [2024-07-23 05:07:31.189156] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:00.135 [2024-07-23 05:07:31.189331] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.394 [2024-07-23 05:07:31.250278] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.394 [2024-07-23 05:07:31.266633] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:00.394 INFO: Running with entropic power schedule (0xFF, 100). 
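The "Recommended dictionary" block that closes run 7 above lists the byte strings (printed as C-style octal escapes) the fuzzer found worth reusing; libFuzzer accepts the same material back through its -dict= option. A sketch of writing those four entries out in dictionary syntax, with the octal escapes converted to hex; the nvmf_7.dict name is invented for illustration, and whether the SPDK llvm_nvme_fuzz wrapper forwards -dict= to libFuzzer is not shown in this log:

cat > nvmf_7.dict <<'EOF'
# libFuzzer dictionary: one name="value" entry per line, \xNN escapes
kw1="\x36\x00"                          # "6\000"
kw2="\x01\x00\x00\x00"                  # "\001\000\000\000"
kw3="\xff\xff\xff\x0d"                  # "\377\377\377\015"
kw4="\x00\x30\xe3\x4c\xc8\x83\xfd\x80"  # "\0000\343L\310\203\375\200"
EOF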
00:08:00.394 INFO: Seed: 1897958569 00:08:00.394 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:00.394 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:00.394 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:00.394 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.394 [2024-07-23 05:07:31.322191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.394 [2024-07-23 05:07:31.322229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.394 #2 INITED cov: 11488 ft: 11485 corp: 1/1b exec/s: 0 rss: 65Mb 00:08:00.394 [2024-07-23 05:07:31.362132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.394 [2024-07-23 05:07:31.362165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.394 #3 NEW cov: 11601 ft: 11976 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:08:00.394 [2024-07-23 05:07:31.422477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.394 [2024-07-23 05:07:31.422509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.394 [2024-07-23 05:07:31.422582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.394 [2024-07-23 05:07:31.422600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.394 #4 NEW cov: 11607 ft: 12963 corp: 3/4b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CrossOver- 00:08:00.394 [2024-07-23 05:07:31.482492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.394 [2024-07-23 05:07:31.482524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.654 #5 NEW cov: 11692 ft: 13152 corp: 4/5b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 CopyPart- 00:08:00.654 [2024-07-23 05:07:31.533195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.533227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.654 [2024-07-23 05:07:31.533297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.533314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.654 [2024-07-23 05:07:31.533381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT 
(15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.533398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.654 [2024-07-23 05:07:31.533469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.533486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.654 #6 NEW cov: 11692 ft: 13627 corp: 5/9b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 CrossOver- 00:08:00.654 [2024-07-23 05:07:31.592813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.592844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.654 #7 NEW cov: 11692 ft: 13706 corp: 6/10b lim: 5 exec/s: 0 rss: 66Mb L: 1/4 MS: 1 ChangeByte- 00:08:00.654 [2024-07-23 05:07:31.633494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.633525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.654 [2024-07-23 05:07:31.633594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.633612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.654 [2024-07-23 05:07:31.633677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.633697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.654 [2024-07-23 05:07:31.633763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.633780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.654 #8 NEW cov: 11692 ft: 13763 corp: 7/14b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:00.654 [2024-07-23 05:07:31.683250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.683282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.654 [2024-07-23 05:07:31.683349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.683368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.654 #9 NEW cov: 11692 ft: 13797 corp: 8/16b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 InsertByte- 00:08:00.654 [2024-07-23 05:07:31.743384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.743416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.654 [2024-07-23 05:07:31.743493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.654 [2024-07-23 05:07:31.743512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.914 #10 NEW cov: 11692 ft: 13827 corp: 9/18b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 InsertByte- 00:08:00.914 [2024-07-23 05:07:31.803401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.803433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.914 #11 NEW cov: 11692 ft: 13856 corp: 10/19b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeBit- 00:08:00.914 [2024-07-23 05:07:31.854062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.854093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.914 [2024-07-23 05:07:31.854161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.854179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.914 [2024-07-23 05:07:31.854247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.854264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.914 [2024-07-23 05:07:31.854331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.854348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.914 #12 NEW cov: 11692 ft: 13883 corp: 11/23b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:00.914 [2024-07-23 05:07:31.894038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.894069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.914 [2024-07-23 05:07:31.894139] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.894157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.914 [2024-07-23 05:07:31.894225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.894244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.914 #13 NEW cov: 11692 ft: 14075 corp: 12/26b lim: 5 exec/s: 0 rss: 67Mb L: 3/4 MS: 1 InsertByte- 00:08:00.914 [2024-07-23 05:07:31.954018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.954049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.914 [2024-07-23 05:07:31.954121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:31.954139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.914 #14 NEW cov: 11692 ft: 14091 corp: 13/28b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 CrossOver- 00:08:00.914 [2024-07-23 05:07:32.004123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:32.004155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.914 [2024-07-23 05:07:32.004226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.914 [2024-07-23 05:07:32.004244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.174 #15 NEW cov: 11692 ft: 14188 corp: 14/30b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 CopyPart- 00:08:01.174 [2024-07-23 05:07:32.054323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.174 [2024-07-23 05:07:32.054354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.174 [2024-07-23 05:07:32.054425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.174 [2024-07-23 05:07:32.054450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.174 #16 NEW cov: 11692 ft: 14202 corp: 15/32b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:01.174 [2024-07-23 05:07:32.114434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 
nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.174 [2024-07-23 05:07:32.114473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.174 [2024-07-23 05:07:32.114542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.174 [2024-07-23 05:07:32.114564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.174 #17 NEW cov: 11692 ft: 14206 corp: 16/34b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 CrossOver- 00:08:01.174 [2024-07-23 05:07:32.154768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.174 [2024-07-23 05:07:32.154800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.174 [2024-07-23 05:07:32.154870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.174 [2024-07-23 05:07:32.154888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.174 [2024-07-23 05:07:32.154958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.174 [2024-07-23 05:07:32.154976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.433 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.433 #18 NEW cov: 11715 ft: 14221 corp: 17/37b lim: 5 exec/s: 18 rss: 68Mb L: 3/4 MS: 1 ChangeBinInt- 00:08:01.433 [2024-07-23 05:07:32.475895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.433 [2024-07-23 05:07:32.475936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.433 [2024-07-23 05:07:32.476009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.433 [2024-07-23 05:07:32.476029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.433 [2024-07-23 05:07:32.476097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.433 [2024-07-23 05:07:32.476115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.433 [2024-07-23 05:07:32.476185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.433 [2024-07-23 05:07:32.476203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.433 #19 NEW cov: 11715 ft: 14241 corp: 18/41b lim: 5 exec/s: 19 rss: 68Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:01.692 [2024-07-23 05:07:32.535692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.535726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.535813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.535831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.535900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.535922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.693 #20 NEW cov: 11715 ft: 14280 corp: 19/44b lim: 5 exec/s: 20 rss: 68Mb L: 3/4 MS: 1 ChangeBit- 00:08:01.693 [2024-07-23 05:07:32.596301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.596333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.596417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.596436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.596511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.596529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.596598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.596616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.693 #21 NEW cov: 11715 ft: 14319 corp: 20/48b lim: 5 exec/s: 21 rss: 69Mb L: 4/4 MS: 1 CopyPart- 00:08:01.693 [2024-07-23 05:07:32.656227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.656260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.656330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 
cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.656348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.656418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.656435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.656509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.656526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.693 #22 NEW cov: 11715 ft: 14347 corp: 21/52b lim: 5 exec/s: 22 rss: 69Mb L: 4/4 MS: 1 ChangeBit- 00:08:01.693 [2024-07-23 05:07:32.716050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.716082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.716163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.716181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.693 #23 NEW cov: 11715 ft: 14383 corp: 22/54b lim: 5 exec/s: 23 rss: 69Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:01.693 [2024-07-23 05:07:32.756379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.756415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.756502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.756520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.693 [2024-07-23 05:07:32.756599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.693 [2024-07-23 05:07:32.756617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.952 #24 NEW cov: 11715 ft: 14410 corp: 23/57b lim: 5 exec/s: 24 rss: 69Mb L: 3/4 MS: 1 InsertByte- 00:08:01.952 [2024-07-23 05:07:32.816326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:32.816359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 [2024-07-23 05:07:32.816430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:32.816454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.952 #25 NEW cov: 11715 ft: 14419 corp: 24/59b lim: 5 exec/s: 25 rss: 69Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:01.952 [2024-07-23 05:07:32.876461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:32.876493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 [2024-07-23 05:07:32.876575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:32.876593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.952 #26 NEW cov: 11715 ft: 14432 corp: 25/61b lim: 5 exec/s: 26 rss: 69Mb L: 2/4 MS: 1 ChangeByte- 00:08:01.952 [2024-07-23 05:07:32.936668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:32.936699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 [2024-07-23 05:07:32.936768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:32.936786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.952 #27 NEW cov: 11715 ft: 14437 corp: 26/63b lim: 5 exec/s: 27 rss: 69Mb L: 2/4 MS: 1 EraseBytes- 00:08:01.952 [2024-07-23 05:07:32.986786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:32.986817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 [2024-07-23 05:07:32.986900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:32.986923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.952 #28 NEW cov: 11715 ft: 14456 corp: 27/65b lim: 5 exec/s: 28 rss: 69Mb L: 2/4 MS: 1 ChangeByte- 00:08:01.952 [2024-07-23 05:07:33.037125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:33.037157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 [2024-07-23 05:07:33.037229] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:33.037248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.952 [2024-07-23 05:07:33.037330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.952 [2024-07-23 05:07:33.037348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.211 #29 NEW cov: 11715 ft: 14491 corp: 28/68b lim: 5 exec/s: 29 rss: 69Mb L: 3/4 MS: 1 InsertByte- 00:08:02.211 [2024-07-23 05:07:33.087097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.211 [2024-07-23 05:07:33.087128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.211 [2024-07-23 05:07:33.087211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.211 [2024-07-23 05:07:33.087229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.211 #30 NEW cov: 11715 ft: 14501 corp: 29/70b lim: 5 exec/s: 30 rss: 69Mb L: 2/4 MS: 1 CrossOver- 00:08:02.211 [2024-07-23 05:07:33.127225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.211 [2024-07-23 05:07:33.127256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.211 [2024-07-23 05:07:33.127339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.211 [2024-07-23 05:07:33.127357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.211 #31 NEW cov: 11715 ft: 14517 corp: 30/72b lim: 5 exec/s: 31 rss: 69Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:02.211 [2024-07-23 05:07:33.187582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.211 [2024-07-23 05:07:33.187613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.211 [2024-07-23 05:07:33.187683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.211 [2024-07-23 05:07:33.187701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.211 [2024-07-23 05:07:33.187771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.211 [2024-07-23 05:07:33.187788] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.211 #32 NEW cov: 11715 ft: 14555 corp: 31/75b lim: 5 exec/s: 32 rss: 69Mb L: 3/4 MS: 1 InsertByte- 00:08:02.211 [2024-07-23 05:07:33.237502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.211 [2024-07-23 05:07:33.237535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.211 [2024-07-23 05:07:33.237604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.211 [2024-07-23 05:07:33.237622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.211 #33 NEW cov: 11715 ft: 14566 corp: 32/77b lim: 5 exec/s: 33 rss: 69Mb L: 2/4 MS: 1 EraseBytes- 00:08:02.212 [2024-07-23 05:07:33.297881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.212 [2024-07-23 05:07:33.297913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.212 [2024-07-23 05:07:33.297996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.212 [2024-07-23 05:07:33.298015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.212 [2024-07-23 05:07:33.298084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.212 [2024-07-23 05:07:33.298102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.470 #34 NEW cov: 11715 ft: 14568 corp: 33/80b lim: 5 exec/s: 17 rss: 69Mb L: 3/4 MS: 1 CrossOver- 00:08:02.470 #34 DONE cov: 11715 ft: 14568 corp: 33/80b lim: 5 exec/s: 17 rss: 69Mb 00:08:02.470 Done 34 runs in 2 second(s) 00:08:02.470 05:07:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:08:02.470 05:07:33 -- ../common.sh@72 -- # (( i++ )) 00:08:02.470 05:07:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.470 05:07:33 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:02.470 05:07:33 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:02.470 05:07:33 -- nvmf/run.sh@24 -- # local timen=1 00:08:02.470 05:07:33 -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.470 05:07:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:02.470 05:07:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:02.470 05:07:33 -- nvmf/run.sh@29 -- # printf %02d 9 00:08:02.470 05:07:33 -- nvmf/run.sh@29 -- # port=4409 00:08:02.470 05:07:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:02.470 05:07:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:02.470 05:07:33 -- nvmf/run.sh@33 -- # 
sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.470 05:07:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:08:02.470 [2024-07-23 05:07:33.504208] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:02.470 [2024-07-23 05:07:33.504278] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3142910 ] 00:08:02.470 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.729 [2024-07-23 05:07:33.719475] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.729 [2024-07-23 05:07:33.794530] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.729 [2024-07-23 05:07:33.794703] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.988 [2024-07-23 05:07:33.855676] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.988 [2024-07-23 05:07:33.872014] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:02.988 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.988 INFO: Seed: 209993897 00:08:02.988 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:02.988 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:02.988 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:02.988 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.988 [2024-07-23 05:07:33.948348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.988 [2024-07-23 05:07:33.948390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.988 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 65Mb 00:08:02.988 [2024-07-23 05:07:33.998302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.988 [2024-07-23 05:07:33.998338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.988 #3 NEW cov: 11601 ft: 11916 corp: 2/2b lim: 5 exec/s: 0 rss: 65Mb L: 1/1 MS: 1 ShuffleBytes- 00:08:02.988 [2024-07-23 05:07:34.068775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.988 [2024-07-23 05:07:34.068812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.988 [2024-07-23 05:07:34.068942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.988 [2024-07-23 05:07:34.068964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.247 #4 NEW cov: 11607 ft: 12783 corp: 3/4b lim: 5 exec/s: 0 rss: 65Mb L: 2/2 MS: 1 CrossOver- 00:08:03.247 [2024-07-23 05:07:34.128736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.247 [2024-07-23 05:07:34.128772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.247 #5 NEW cov: 11692 ft: 13033 corp: 4/5b lim: 5 exec/s: 0 rss: 65Mb L: 1/2 MS: 1 ChangeByte- 00:08:03.247 [2024-07-23 05:07:34.188900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.247 [2024-07-23 05:07:34.188936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.247 #6 NEW cov: 11692 ft: 13125 corp: 5/6b lim: 5 exec/s: 0 rss: 65Mb L: 1/2 MS: 1 EraseBytes- 00:08:03.247 [2024-07-23 05:07:34.259088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.247 [2024-07-23 05:07:34.259123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.247 #7 NEW cov: 11692 ft: 13193 corp: 6/7b lim: 5 exec/s: 0 rss: 65Mb L: 1/2 MS: 1 ChangeASCIIInt- 00:08:03.247 [2024-07-23 05:07:34.329349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.247 [2024-07-23 05:07:34.329386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.506 #8 NEW cov: 11692 ft: 13272 corp: 7/8b lim: 5 exec/s: 0 rss: 65Mb L: 1/2 MS: 1 ChangeByte- 00:08:03.506 [2024-07-23 05:07:34.400711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.506 [2024-07-23 05:07:34.400746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.506 [2024-07-23 05:07:34.400879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.506 [2024-07-23 05:07:34.400901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.506 [2024-07-23 05:07:34.401029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.506 [2024-07-23 05:07:34.401051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.506 [2024-07-23 05:07:34.401181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.506 [2024-07-23 05:07:34.401206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.506 [2024-07-23 05:07:34.401327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.506 [2024-07-23 05:07:34.401346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.506 #9 NEW cov: 11692 ft: 13659 corp: 8/13b lim: 5 exec/s: 0 rss: 65Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:03.506 [2024-07-23 05:07:34.460352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.506 [2024-07-23 05:07:34.460387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.506 [2024-07-23 05:07:34.460492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.506 [2024-07-23 05:07:34.460514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.507 [2024-07-23 05:07:34.460654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.507 [2024-07-23 05:07:34.460675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.507 #10 NEW cov: 11692 ft: 13960 corp: 9/16b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 EraseBytes- 00:08:03.507 [2024-07-23 05:07:34.530871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.507 [2024-07-23 05:07:34.530904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.507 [2024-07-23 05:07:34.531027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.507 [2024-07-23 05:07:34.531050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.507 [2024-07-23 05:07:34.531174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.507 [2024-07-23 05:07:34.531197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.507 [2024-07-23 05:07:34.531314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.507 [2024-07-23 05:07:34.531336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.507 #11 NEW cov: 11692 ft: 14060 
corp: 10/20b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 CMP- DE: "\377\377"- 00:08:03.507 [2024-07-23 05:07:34.590105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.507 [2024-07-23 05:07:34.590140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.765 #12 NEW cov: 11692 ft: 14201 corp: 11/21b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 CopyPart- 00:08:03.765 [2024-07-23 05:07:34.660360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.765 [2024-07-23 05:07:34.660395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.765 #13 NEW cov: 11692 ft: 14206 corp: 12/22b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:03.765 [2024-07-23 05:07:34.711698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.765 [2024-07-23 05:07:34.711732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.765 [2024-07-23 05:07:34.711867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.765 [2024-07-23 05:07:34.711890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.765 [2024-07-23 05:07:34.712019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.765 [2024-07-23 05:07:34.712041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.765 [2024-07-23 05:07:34.712171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.765 [2024-07-23 05:07:34.712193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.765 [2024-07-23 05:07:34.712321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.765 [2024-07-23 05:07:34.712342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.765 #14 NEW cov: 11692 ft: 14242 corp: 13/27b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 CopyPart- 00:08:03.765 [2024-07-23 05:07:34.770667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.765 [2024-07-23 05:07:34.770702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.332 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 
00:08:04.332 #15 NEW cov: 11715 ft: 14330 corp: 14/28b lim: 5 exec/s: 15 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:08:04.332 [2024-07-23 05:07:35.223494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.332 [2024-07-23 05:07:35.223541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.332 [2024-07-23 05:07:35.223699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.332 [2024-07-23 05:07:35.223720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.332 [2024-07-23 05:07:35.223869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.332 [2024-07-23 05:07:35.223892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.332 [2024-07-23 05:07:35.224045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.332 [2024-07-23 05:07:35.224068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.333 [2024-07-23 05:07:35.224220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.333 [2024-07-23 05:07:35.224243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.333 #16 NEW cov: 11715 ft: 14437 corp: 15/33b lim: 5 exec/s: 16 rss: 67Mb L: 5/5 MS: 1 InsertByte- 00:08:04.333 [2024-07-23 05:07:35.292787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.333 [2024-07-23 05:07:35.292822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.333 [2024-07-23 05:07:35.292965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.333 [2024-07-23 05:07:35.292990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.333 #17 NEW cov: 11715 ft: 14467 corp: 16/35b lim: 5 exec/s: 17 rss: 67Mb L: 2/5 MS: 1 CopyPart- 00:08:04.333 [2024-07-23 05:07:35.353840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.333 [2024-07-23 05:07:35.353874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.333 [2024-07-23 05:07:35.354021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:04.333 [2024-07-23 05:07:35.354045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.333 [2024-07-23 05:07:35.354195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.333 [2024-07-23 05:07:35.354219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.333 [2024-07-23 05:07:35.354365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.333 [2024-07-23 05:07:35.354386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.333 #18 NEW cov: 11715 ft: 14491 corp: 17/39b lim: 5 exec/s: 18 rss: 67Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:04.333 [2024-07-23 05:07:35.423449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.333 [2024-07-23 05:07:35.423489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.333 [2024-07-23 05:07:35.423651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.333 [2024-07-23 05:07:35.423673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.592 #19 NEW cov: 11715 ft: 14519 corp: 18/41b lim: 5 exec/s: 19 rss: 67Mb L: 2/5 MS: 1 PersAutoDict- DE: "\377\377"- 00:08:04.592 [2024-07-23 05:07:35.483488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.483525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.592 #20 NEW cov: 11715 ft: 14536 corp: 19/42b lim: 5 exec/s: 20 rss: 67Mb L: 1/5 MS: 1 ChangeByte- 00:08:04.592 [2024-07-23 05:07:35.534883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.534918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.592 [2024-07-23 05:07:35.535059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.535081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.592 [2024-07-23 05:07:35.535223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.535246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:04.592 [2024-07-23 05:07:35.535392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.535418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.592 #21 NEW cov: 11715 ft: 14559 corp: 20/46b lim: 5 exec/s: 21 rss: 67Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:04.592 [2024-07-23 05:07:35.595363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.595399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.592 [2024-07-23 05:07:35.595547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.595570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.592 [2024-07-23 05:07:35.595717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.595740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.592 [2024-07-23 05:07:35.595885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.595908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.592 #22 NEW cov: 11715 ft: 14675 corp: 21/50b lim: 5 exec/s: 22 rss: 67Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:04.592 [2024-07-23 05:07:35.655113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.655149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.592 [2024-07-23 05:07:35.655292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.655314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.592 [2024-07-23 05:07:35.655471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.592 [2024-07-23 05:07:35.655493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.851 #23 NEW cov: 11715 ft: 14702 corp: 22/53b lim: 5 exec/s: 23 rss: 68Mb L: 3/5 MS: 1 CrossOver- 00:08:04.851 [2024-07-23 05:07:35.724756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:04.851 [2024-07-23 05:07:35.724791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.851 #24 NEW cov: 11715 ft: 14712 corp: 23/54b lim: 5 exec/s: 24 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:08:04.851 [2024-07-23 05:07:35.795839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.851 [2024-07-23 05:07:35.795875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.851 [2024-07-23 05:07:35.796028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.851 [2024-07-23 05:07:35.796051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.851 [2024-07-23 05:07:35.796181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.851 [2024-07-23 05:07:35.796201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.851 #25 NEW cov: 11715 ft: 14807 corp: 24/57b lim: 5 exec/s: 25 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:08:04.851 [2024-07-23 05:07:35.865330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.851 [2024-07-23 05:07:35.865365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.851 #26 NEW cov: 11715 ft: 14818 corp: 25/58b lim: 5 exec/s: 26 rss: 68Mb L: 1/5 MS: 1 ChangeBinInt- 00:08:04.851 [2024-07-23 05:07:35.925942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.851 [2024-07-23 05:07:35.925979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.851 [2024-07-23 05:07:35.926119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.851 [2024-07-23 05:07:35.926145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.111 #27 NEW cov: 11715 ft: 14877 corp: 26/60b lim: 5 exec/s: 13 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:08:05.111 #27 DONE cov: 11715 ft: 14877 corp: 26/60b lim: 5 exec/s: 13 rss: 68Mb 00:08:05.111 ###### Recommended dictionary. ###### 00:08:05.111 "\377\377" # Uses: 1 00:08:05.111 ###### End of recommended dictionary. 
###### 00:08:05.111 Done 27 runs in 2 second(s) 00:08:05.111 05:07:36 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:08:05.111 05:07:36 -- ../common.sh@72 -- # (( i++ )) 00:08:05.111 05:07:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.111 05:07:36 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:05.111 05:07:36 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:05.111 05:07:36 -- nvmf/run.sh@24 -- # local timen=1 00:08:05.111 05:07:36 -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.111 05:07:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:05.111 05:07:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:05.111 05:07:36 -- nvmf/run.sh@29 -- # printf %02d 10 00:08:05.111 05:07:36 -- nvmf/run.sh@29 -- # port=4410 00:08:05.111 05:07:36 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:05.111 05:07:36 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:05.111 05:07:36 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.111 05:07:36 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:08:05.111 [2024-07-23 05:07:36.134284] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:05.111 [2024-07-23 05:07:36.134351] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3143447 ] 00:08:05.111 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.370 [2024-07-23 05:07:36.346490] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.370 [2024-07-23 05:07:36.422033] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.370 [2024-07-23 05:07:36.422205] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.630 [2024-07-23 05:07:36.483167] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.630 [2024-07-23 05:07:36.499487] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:05.630 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.630 INFO: Seed: 2835999099 00:08:05.630 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:05.630 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:05.630 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:05.630 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.630 #2 INITED exec/s: 0 rss: 60Mb 00:08:05.630 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.630 This may also happen if the target rejected all inputs we tried so far 00:08:05.630 [2024-07-23 05:07:36.555385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.630 [2024-07-23 05:07:36.555422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.630 [2024-07-23 05:07:36.555496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.630 [2024-07-23 05:07:36.555515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.630 [2024-07-23 05:07:36.555582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.630 [2024-07-23 05:07:36.555603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.630 [2024-07-23 05:07:36.555671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.630 [2024-07-23 05:07:36.555689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.889 NEW_FUNC[1/669]: 0x48daf0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:05.889 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.889 #3 NEW cov: 11502 ft: 11512 corp: 2/38b lim: 40 exec/s: 0 rss: 67Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:06.148 [2024-07-23 05:07:36.996467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:36.996509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:36.996595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:36.996614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:36.996688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:36.996706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.148 NEW_FUNC[1/1]: 0x124a950 in nvmf_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/nvmf.c:153 00:08:06.148 #4 NEW cov: 11624 ft: 12497 corp: 3/62b lim: 40 exec/s: 0 rss: 67Mb L: 24/37 MS: 1 EraseBytes- 00:08:06.148 [2024-07-23 05:07:37.056486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 
cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.056519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.056599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.056618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.056696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.056715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.148 #10 NEW cov: 11630 ft: 12801 corp: 4/86b lim: 40 exec/s: 0 rss: 67Mb L: 24/37 MS: 1 ShuffleBytes- 00:08:06.148 [2024-07-23 05:07:37.116748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.116780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.116857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.116876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.116954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:43474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.116973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.148 #11 NEW cov: 11715 ft: 13021 corp: 5/110b lim: 40 exec/s: 0 rss: 67Mb L: 24/37 MS: 1 ChangeBit- 00:08:06.148 [2024-07-23 05:07:37.177058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0effffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.177090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.177165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.177183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.177258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.177275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.177352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.177370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.148 #14 NEW cov: 11715 ft: 13181 corp: 6/143b lim: 40 exec/s: 0 rss: 67Mb L: 33/37 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:06.148 [2024-07-23 05:07:37.217206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.217239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.217319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47d1d1d1 cdw11:d1d1d1d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.217338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.217414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d1d1d1d1 cdw11:d1474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.217432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.148 [2024-07-23 05:07:37.217516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.148 [2024-07-23 05:07:37.217534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.407 #15 NEW cov: 11715 ft: 13271 corp: 7/179b lim: 40 exec/s: 0 rss: 67Mb L: 36/37 MS: 1 InsertRepeatedBytes- 00:08:06.407 [2024-07-23 05:07:37.266832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6a0a4747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.266865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.407 #18 NEW cov: 11715 ft: 13708 corp: 8/188b lim: 40 exec/s: 0 rss: 68Mb L: 9/37 MS: 3 ChangeBit-ChangeBit-CrossOver- 00:08:06.407 [2024-07-23 05:07:37.317641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:7a7a7a7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.317679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.317737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47d1d1d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.317756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.317831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d1d1d1d1 cdw11:d1d1d1d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.317850] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.317922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d1474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.317940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.318014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.318032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.407 #19 NEW cov: 11715 ft: 13799 corp: 9/228b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:06.407 [2024-07-23 05:07:37.377471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.377504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.377581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.377600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.377673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:473a4747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.377691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.407 #20 NEW cov: 11715 ft: 13809 corp: 10/252b lim: 40 exec/s: 0 rss: 68Mb L: 24/40 MS: 1 ChangeByte- 00:08:06.407 [2024-07-23 05:07:37.427584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a471e00 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.427616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.427696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.427715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.427796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:4747473a cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.427815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.407 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.407 #21 NEW cov: 11738 ft: 13886 corp: 11/278b lim: 40 
exec/s: 0 rss: 68Mb L: 26/40 MS: 1 CMP- DE: "\036\000"- 00:08:06.407 [2024-07-23 05:07:37.487763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a471e00 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.487795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.487875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:3a474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.487894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.407 [2024-07-23 05:07:37.487970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:4747473a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.407 [2024-07-23 05:07:37.487989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.666 #22 NEW cov: 11738 ft: 13916 corp: 12/308b lim: 40 exec/s: 0 rss: 68Mb L: 30/40 MS: 1 CopyPart- 00:08:06.666 [2024-07-23 05:07:37.548018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.548051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.666 [2024-07-23 05:07:37.548130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.548149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.666 [2024-07-23 05:07:37.548225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.548243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.666 #23 NEW cov: 11738 ft: 13923 corp: 13/332b lim: 40 exec/s: 23 rss: 68Mb L: 24/40 MS: 1 CopyPart- 00:08:06.666 [2024-07-23 05:07:37.588273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0effffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.588305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.666 [2024-07-23 05:07:37.588384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.588403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.666 [2024-07-23 05:07:37.588474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.588493] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.666 [2024-07-23 05:07:37.588566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:dfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.588584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.666 #24 NEW cov: 11738 ft: 13966 corp: 14/365b lim: 40 exec/s: 24 rss: 68Mb L: 33/40 MS: 1 ChangeBit- 00:08:06.666 [2024-07-23 05:07:37.648586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474745 cdw11:7a7a7a7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.648621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.666 [2024-07-23 05:07:37.648700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47d1d1d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.648719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.666 [2024-07-23 05:07:37.648790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d1d1d1d1 cdw11:d1d1d1d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.648808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.666 [2024-07-23 05:07:37.648878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d1474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.648896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.666 [2024-07-23 05:07:37.648967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.648985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.666 #25 NEW cov: 11738 ft: 14011 corp: 15/405b lim: 40 exec/s: 25 rss: 68Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:06.666 [2024-07-23 05:07:37.708126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.666 [2024-07-23 05:07:37.708158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.666 #31 NEW cov: 11738 ft: 14049 corp: 16/419b lim: 40 exec/s: 31 rss: 68Mb L: 14/40 MS: 1 EraseBytes- 00:08:06.925 [2024-07-23 05:07:37.768620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.925 [2024-07-23 05:07:37.768653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.925 [2024-07-23 05:07:37.768730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.925 [2024-07-23 05:07:37.768749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.925 [2024-07-23 05:07:37.768825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.925 [2024-07-23 05:07:37.768843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.925 #32 NEW cov: 11738 ft: 14064 corp: 17/443b lim: 40 exec/s: 32 rss: 69Mb L: 24/40 MS: 1 ShuffleBytes- 00:08:06.925 [2024-07-23 05:07:37.828945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a471e00 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.925 [2024-07-23 05:07:37.828976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.925 [2024-07-23 05:07:37.829050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:3a474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.925 [2024-07-23 05:07:37.829068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.925 [2024-07-23 05:07:37.829142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:4747473a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.925 [2024-07-23 05:07:37.829164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.925 [2024-07-23 05:07:37.829235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:1e004747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.925 [2024-07-23 05:07:37.829253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.925 #33 NEW cov: 11738 ft: 14084 corp: 18/475b lim: 40 exec/s: 33 rss: 69Mb L: 32/40 MS: 1 PersAutoDict- DE: "\036\000"- 00:08:06.925 [2024-07-23 05:07:37.878932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a471e00 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.925 [2024-07-23 05:07:37.878965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.925 [2024-07-23 05:07:37.879045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.925 [2024-07-23 05:07:37.879064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.926 [2024-07-23 05:07:37.879138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:4747473a cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.926 [2024-07-23 05:07:37.879156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.926 #34 NEW cov: 
11738 ft: 14102 corp: 19/501b lim: 40 exec/s: 34 rss: 69Mb L: 26/40 MS: 1 CopyPart- 00:08:06.926 [2024-07-23 05:07:37.919107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.926 [2024-07-23 05:07:37.919138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.926 [2024-07-23 05:07:37.919219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.926 [2024-07-23 05:07:37.919238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.926 [2024-07-23 05:07:37.919315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:a7474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.926 [2024-07-23 05:07:37.919333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.926 #35 NEW cov: 11738 ft: 14137 corp: 20/526b lim: 40 exec/s: 35 rss: 69Mb L: 25/40 MS: 1 InsertByte- 00:08:06.926 [2024-07-23 05:07:37.959186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a471e00 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.926 [2024-07-23 05:07:37.959218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.926 [2024-07-23 05:07:37.959296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.926 [2024-07-23 05:07:37.959315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.926 [2024-07-23 05:07:37.959393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a474747 cdw11:4747473a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.926 [2024-07-23 05:07:37.959412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.926 #36 NEW cov: 11738 ft: 14172 corp: 21/556b lim: 40 exec/s: 36 rss: 69Mb L: 30/40 MS: 1 CrossOver- 00:08:06.926 [2024-07-23 05:07:37.998945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a280a00 cdw11:00001e00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.926 [2024-07-23 05:07:37.998978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.185 #40 NEW cov: 11738 ft: 14255 corp: 22/565b lim: 40 exec/s: 40 rss: 69Mb L: 9/40 MS: 4 CrossOver-InsertByte-CMP-PersAutoDict- DE: "\000\000\000\000"-"\036\000"- 00:08:07.185 [2024-07-23 05:07:38.049136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:b9b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.049168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.185 #41 NEW cov: 11738 ft: 14262 corp: 23/579b lim: 40 
exec/s: 41 rss: 69Mb L: 14/40 MS: 1 ChangeBinInt- 00:08:07.185 [2024-07-23 05:07:38.109649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.109681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.185 [2024-07-23 05:07:38.109770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.109789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.185 [2024-07-23 05:07:38.109861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.109880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.185 #42 NEW cov: 11738 ft: 14267 corp: 24/603b lim: 40 exec/s: 42 rss: 69Mb L: 24/40 MS: 1 ShuffleBytes- 00:08:07.185 [2024-07-23 05:07:38.149757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a471e00 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.149789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.185 [2024-07-23 05:07:38.149865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.149884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.185 [2024-07-23 05:07:38.149958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:4747473a cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.149976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.185 #43 NEW cov: 11738 ft: 14274 corp: 25/629b lim: 40 exec/s: 43 rss: 69Mb L: 26/40 MS: 1 CrossOver- 00:08:07.185 [2024-07-23 05:07:38.209921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.209953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.185 [2024-07-23 05:07:38.210029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.210047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.185 [2024-07-23 05:07:38.210119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.210141] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.185 #45 NEW cov: 11738 ft: 14275 corp: 26/658b lim: 40 exec/s: 45 rss: 69Mb L: 29/40 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:07.185 [2024-07-23 05:07:38.269959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a471e00 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.185 [2024-07-23 05:07:38.269991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.185 [2024-07-23 05:07:38.270064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:4747473a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.186 [2024-07-23 05:07:38.270084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 #46 NEW cov: 11738 ft: 14468 corp: 27/680b lim: 40 exec/s: 46 rss: 69Mb L: 22/40 MS: 1 EraseBytes- 00:08:07.445 [2024-07-23 05:07:38.320596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a474747 cdw11:7a7a7a7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.320627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.320703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47d1d1d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.320722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.320798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d1d1d1d1 cdw11:d1d1d1d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.320816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.320887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:47d14747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.320905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.320973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.320992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.445 #47 NEW cov: 11738 ft: 14482 corp: 28/720b lim: 40 exec/s: 47 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:07.445 [2024-07-23 05:07:38.360528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0effffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.360559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.360649] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.360668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.360740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.360758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.360832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffdfffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.360850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.445 #53 NEW cov: 11738 ft: 14510 corp: 29/753b lim: 40 exec/s: 53 rss: 69Mb L: 33/40 MS: 1 ShuffleBytes- 00:08:07.445 [2024-07-23 05:07:38.420395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6a0a4747 cdw11:0a474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.420427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.420510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b9b8b8b8 cdw11:b8b8b8b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.420530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 #54 NEW cov: 11738 ft: 14528 corp: 30/775b lim: 40 exec/s: 54 rss: 69Mb L: 22/40 MS: 1 CrossOver- 00:08:07.445 [2024-07-23 05:07:38.480893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a471e00 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.480926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.481006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.481025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.481101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a474747 cdw11:4747473a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.481120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.445 [2024-07-23 05:07:38.481191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:47474747 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.481210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:08:07.445 #55 NEW cov: 11738 ft: 14534 corp: 31/809b lim: 40 exec/s: 55 rss: 69Mb L: 34/40 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:07.445 [2024-07-23 05:07:38.530554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e280a00 cdw11:00001e00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.445 [2024-07-23 05:07:38.530587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.704 #56 NEW cov: 11738 ft: 14544 corp: 32/818b lim: 40 exec/s: 28 rss: 69Mb L: 9/40 MS: 1 ChangeBinInt- 00:08:07.704 #56 DONE cov: 11738 ft: 14544 corp: 32/818b lim: 40 exec/s: 28 rss: 69Mb 00:08:07.704 ###### Recommended dictionary. ###### 00:08:07.704 "\036\000" # Uses: 2 00:08:07.704 "\000\000\000\000" # Uses: 1 00:08:07.704 ###### End of recommended dictionary. ###### 00:08:07.704 Done 56 runs in 2 second(s) 00:08:07.704 05:07:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:08:07.704 05:07:38 -- ../common.sh@72 -- # (( i++ )) 00:08:07.704 05:07:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.704 05:07:38 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:07.704 05:07:38 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:07.704 05:07:38 -- nvmf/run.sh@24 -- # local timen=1 00:08:07.704 05:07:38 -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.704 05:07:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:07.704 05:07:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:07.704 05:07:38 -- nvmf/run.sh@29 -- # printf %02d 11 00:08:07.704 05:07:38 -- nvmf/run.sh@29 -- # port=4411 00:08:07.704 05:07:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:07.704 05:07:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:07.704 05:07:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.704 05:07:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:08:07.704 [2024-07-23 05:07:38.743818] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:07.704 [2024-07-23 05:07:38.743889] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3143848 ] 00:08:07.704 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.963 [2024-07-23 05:07:38.965777] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.963 [2024-07-23 05:07:39.041124] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:07.963 [2024-07-23 05:07:39.041297] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.222 [2024-07-23 05:07:39.102651] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.222 [2024-07-23 05:07:39.119012] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:08.222 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.222 INFO: Seed: 1161033797 00:08:08.222 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:08.222 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:08.222 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:08.222 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.222 #2 INITED exec/s: 0 rss: 60Mb 00:08:08.222 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:08.222 This may also happen if the target rejected all inputs we tried so far 00:08:08.222 [2024-07-23 05:07:39.174450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:022e2e2e cdw11:2e2e2e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.222 [2024-07-23 05:07:39.174487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.790 NEW_FUNC[1/671]: 0x48f860 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:08.790 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.791 #7 NEW cov: 11523 ft: 11524 corp: 2/9b lim: 40 exec/s: 0 rss: 66Mb L: 8/8 MS: 5 ChangeBit-CopyPart-CrossOver-InsertRepeatedBytes-CopyPart- 00:08:08.791 [2024-07-23 05:07:39.615500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:2e2e2e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.615540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.791 #8 NEW cov: 11636 ft: 12070 corp: 3/17b lim: 40 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:08.791 [2024-07-23 05:07:39.675595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:022e2e2e cdw11:2eb32e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.675628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.791 #9 NEW cov: 11642 ft: 12408 corp: 4/25b lim: 40 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 ChangeByte- 00:08:08.791 [2024-07-23 05:07:39.715987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2eca cdw11:cacacaca SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.716019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.791 [2024-07-23 05:07:39.716093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:cacacaca SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.716112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.791 [2024-07-23 05:07:39.716181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:cacacaca SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.716200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.791 #11 NEW cov: 11727 ft: 13321 corp: 5/50b lim: 40 exec/s: 0 rss: 67Mb L: 25/25 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:08.791 [2024-07-23 05:07:39.775883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.775914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.791 #12 NEW cov: 11727 ft: 13378 corp: 6/58b lim: 40 exec/s: 0 rss: 67Mb L: 8/25 MS: 1 ChangeBinInt- 00:08:08.791 [2024-07-23 05:07:39.826022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e022e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.826053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.791 #13 NEW cov: 11727 ft: 13409 corp: 7/66b lim: 40 exec/s: 0 rss: 67Mb L: 8/25 MS: 1 ShuffleBytes- 00:08:08.791 [2024-07-23 05:07:39.866663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.866694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.791 [2024-07-23 05:07:39.866762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.866780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.791 [2024-07-23 05:07:39.866846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.866864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.791 [2024-07-23 05:07:39.866925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.791 [2024-07-23 05:07:39.866943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.050 #15 NEW cov: 11727 ft: 13770 corp: 8/105b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:09.050 [2024-07-23 05:07:39.916664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00efefef cdw11:efefefef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.050 [2024-07-23 05:07:39.916695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.050 [2024-07-23 05:07:39.916769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefefef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.050 [2024-07-23 05:07:39.916792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.050 [2024-07-23 05:07:39.916863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ef000008 cdw11:2e2e2e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.050 [2024-07-23 05:07:39.916881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.050 #16 NEW cov: 11727 ft: 13800 corp: 9/129b lim: 40 exec/s: 0 rss: 67Mb L: 24/39 MS: 1 InsertRepeatedBytes- 00:08:09.050 [2024-07-23 05:07:39.966410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e032e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.050 [2024-07-23 05:07:39.966449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.050 #17 NEW cov: 11727 ft: 13904 corp: 10/137b lim: 40 exec/s: 0 rss: 67Mb L: 8/39 MS: 1 ChangeBit- 00:08:09.050 [2024-07-23 05:07:40.016575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02ae2e2e cdw11:2eb32e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.050 [2024-07-23 05:07:40.016607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.050 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.050 #18 NEW cov: 11750 ft: 13928 corp: 11/145b lim: 40 exec/s: 0 rss: 68Mb L: 8/39 MS: 1 ChangeBit- 00:08:09.050 [2024-07-23 05:07:40.077104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00efefef cdw11:efefefef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.050 [2024-07-23 05:07:40.077139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.050 [2024-07-23 05:07:40.077211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefef5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.050 [2024-07-23 05:07:40.077230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.050 [2024-07-23 05:07:40.077295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:efef0000 cdw11:082e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.050 [2024-07-23 05:07:40.077311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.050 #19 NEW cov: 11750 ft: 13986 corp: 12/170b lim: 40 exec/s: 0 rss: 68Mb L: 25/39 MS: 1 InsertByte- 00:08:09.050 [2024-07-23 05:07:40.136953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e032e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.050 [2024-07-23 05:07:40.136988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.309 #20 NEW cov: 11750 ft: 14007 corp: 13/184b lim: 40 exec/s: 20 rss: 69Mb L: 14/39 MS: 1 CopyPart- 00:08:09.309 [2024-07-23 05:07:40.187208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:2e2e2e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.309 [2024-07-23 05:07:40.187240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.309 [2024-07-23 05:07:40.187310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ebebebeb cdw11:ebebebeb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.309 [2024-07-23 05:07:40.187330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.309 #21 NEW cov: 11750 ft: 14225 corp: 14/206b lim: 40 exec/s: 21 rss: 69Mb L: 22/39 MS: 1 InsertRepeatedBytes- 00:08:09.309 [2024-07-23 05:07:40.237171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a2e022e cdw11:2e2e2eb3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.309 [2024-07-23 05:07:40.237207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.309 #22 NEW cov: 11750 ft: 14271 corp: 15/214b lim: 40 exec/s: 22 rss: 69Mb L: 8/39 MS: 1 ShuffleBytes- 00:08:09.309 [2024-07-23 05:07:40.277304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a5d2e23 cdw11:0a2e022e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.309 [2024-07-23 05:07:40.277335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.309 #26 NEW cov: 11750 ft: 14285 corp: 16/222b lim: 40 exec/s: 26 rss: 69Mb L: 8/39 MS: 4 EraseBytes-CopyPart-InsertByte-InsertByte- 00:08:09.309 [2024-07-23 05:07:40.327434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:53ae2e2e cdw11:0a022e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.309 [2024-07-23 05:07:40.327471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.309 #29 NEW cov: 11750 ft: 14302 corp: 17/235b lim: 40 exec/s: 29 rss: 69Mb L: 13/39 MS: 3 EraseBytes-ChangeByte-CrossOver- 00:08:09.309 [2024-07-23 05:07:40.377598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e092e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.309 [2024-07-23 05:07:40.377630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.568 #30 NEW cov: 11750 ft: 14320 corp: 18/243b lim: 40 exec/s: 30 rss: 69Mb L: 8/39 MS: 1 ChangeBinInt- 00:08:09.568 [2024-07-23 05:07:40.417713] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.568 [2024-07-23 05:07:40.417746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.569 #31 NEW cov: 11750 ft: 14353 corp: 19/251b lim: 40 exec/s: 31 rss: 69Mb L: 8/39 MS: 1 CopyPart- 00:08:09.569 [2024-07-23 05:07:40.457831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2eff02ae cdw11:2e2e2eb3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.569 [2024-07-23 05:07:40.457863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.569 #35 NEW cov: 11750 ft: 14401 corp: 20/261b lim: 40 exec/s: 35 rss: 69Mb L: 10/39 MS: 4 InsertByte-CrossOver-ChangeByte-CrossOver- 00:08:09.569 [2024-07-23 05:07:40.497927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:33080000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.569 [2024-07-23 05:07:40.497958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.569 #36 NEW cov: 11750 ft: 14417 corp: 21/270b lim: 40 exec/s: 36 rss: 69Mb L: 9/39 MS: 1 InsertByte- 00:08:09.569 [2024-07-23 05:07:40.548277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00efefef cdw11:efefefef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.569 [2024-07-23 05:07:40.548309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.569 [2024-07-23 05:07:40.548376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:0000082e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.569 [2024-07-23 05:07:40.548395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.569 #37 NEW cov: 11750 ft: 14423 corp: 22/289b lim: 40 exec/s: 37 rss: 69Mb L: 19/39 MS: 1 EraseBytes- 00:08:09.569 [2024-07-23 05:07:40.608380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:53ae2e2e cdw11:0a022e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.569 [2024-07-23 05:07:40.608420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.569 [2024-07-23 05:07:40.608496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e8080 cdw11:8080802e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.569 [2024-07-23 05:07:40.608515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.569 #38 NEW cov: 11750 ft: 14504 corp: 23/307b lim: 40 exec/s: 38 rss: 69Mb L: 18/39 MS: 1 InsertRepeatedBytes- 00:08:09.828 [2024-07-23 05:07:40.668778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2eca cdw11:cacacaca SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.668810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.828 [2024-07-23 05:07:40.668877] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:cacacaca cdw11:cacacaca SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.668895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.828 [2024-07-23 05:07:40.668958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cacacaca cdw11:cacacaca SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.668976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.828 #39 NEW cov: 11750 ft: 14520 corp: 24/335b lim: 40 exec/s: 39 rss: 70Mb L: 28/39 MS: 1 CopyPart- 00:08:09.828 [2024-07-23 05:07:40.729127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.729157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.828 [2024-07-23 05:07:40.729225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.729244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.828 [2024-07-23 05:07:40.729310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00022e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.729329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.828 [2024-07-23 05:07:40.729392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:2e2eb32e cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.729410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.828 #40 NEW cov: 11750 ft: 14540 corp: 25/374b lim: 40 exec/s: 40 rss: 70Mb L: 39/39 MS: 1 CrossOver- 00:08:09.828 [2024-07-23 05:07:40.788791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e36 cdw11:2e022e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.788823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.828 #41 NEW cov: 11750 ft: 14550 corp: 26/382b lim: 40 exec/s: 41 rss: 70Mb L: 8/39 MS: 1 ChangeBinInt- 00:08:09.828 [2024-07-23 05:07:40.828844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2ea92e2e cdw11:2e032e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.828875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.828 #42 NEW cov: 11750 ft: 14599 corp: 27/390b lim: 40 exec/s: 42 rss: 70Mb L: 8/39 MS: 1 ChangeByte- 00:08:09.828 [2024-07-23 05:07:40.869017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8bc20de0 cdw11:52e33000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.869049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.828 #45 NEW cov: 11750 ft: 14614 corp: 28/401b lim: 40 exec/s: 45 rss: 70Mb L: 11/39 MS: 3 ShuffleBytes-CMP-CMP- DE: "\011\000"-"\213\302\015\340R\3430\000"- 00:08:09.828 [2024-07-23 05:07:40.909096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e0a2e cdw11:362e022e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.828 [2024-07-23 05:07:40.909127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.088 #46 NEW cov: 11750 ft: 14644 corp: 29/410b lim: 40 exec/s: 46 rss: 70Mb L: 9/39 MS: 1 CrossOver- 00:08:10.088 [2024-07-23 05:07:40.959275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2eff02ae cdw11:09002eb3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.088 [2024-07-23 05:07:40.959306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.088 #47 NEW cov: 11750 ft: 14649 corp: 30/420b lim: 40 exec/s: 47 rss: 70Mb L: 10/39 MS: 1 PersAutoDict- DE: "\011\000"- 00:08:10.088 [2024-07-23 05:07:41.009375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02ae2e2e cdw11:2eb32eae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.088 [2024-07-23 05:07:41.009406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.088 #48 NEW cov: 11750 ft: 14651 corp: 31/428b lim: 40 exec/s: 48 rss: 70Mb L: 8/39 MS: 1 CopyPart- 00:08:10.088 [2024-07-23 05:07:41.049675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.088 [2024-07-23 05:07:41.049707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.088 [2024-07-23 05:07:41.049774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.088 [2024-07-23 05:07:41.049793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.088 #49 NEW cov: 11750 ft: 14691 corp: 32/444b lim: 40 exec/s: 49 rss: 70Mb L: 16/39 MS: 1 CMP- DE: "\000\000\000\000\000\000\004\000"- 00:08:10.088 [2024-07-23 05:07:41.099655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:2e0a2e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.088 [2024-07-23 05:07:41.099686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.088 #50 NEW cov: 11750 ft: 14737 corp: 33/452b lim: 40 exec/s: 50 rss: 70Mb L: 8/39 MS: 1 CopyPart- 00:08:10.088 [2024-07-23 05:07:41.139764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.088 [2024-07-23 05:07:41.139795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:10.088 #51 NEW cov: 11750 ft: 14743 corp: 34/462b lim: 40 exec/s: 25 rss: 70Mb L: 10/39 MS: 1 InsertRepeatedBytes- 00:08:10.088 #51 DONE cov: 11750 ft: 14743 corp: 34/462b lim: 40 exec/s: 25 rss: 70Mb 00:08:10.088 ###### Recommended dictionary. ###### 00:08:10.088 "\011\000" # Uses: 1 00:08:10.088 "\213\302\015\340R\3430\000" # Uses: 0 00:08:10.088 "\000\000\000\000\000\000\004\000" # Uses: 0 00:08:10.088 ###### End of recommended dictionary. ###### 00:08:10.088 Done 51 runs in 2 second(s) 00:08:10.347 05:07:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:08:10.347 05:07:41 -- ../common.sh@72 -- # (( i++ )) 00:08:10.348 05:07:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.348 05:07:41 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:10.348 05:07:41 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:10.348 05:07:41 -- nvmf/run.sh@24 -- # local timen=1 00:08:10.348 05:07:41 -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.348 05:07:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:10.348 05:07:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:10.348 05:07:41 -- nvmf/run.sh@29 -- # printf %02d 12 00:08:10.348 05:07:41 -- nvmf/run.sh@29 -- # port=4412 00:08:10.348 05:07:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:10.348 05:07:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:10.348 05:07:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.348 05:07:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:08:10.348 [2024-07-23 05:07:41.343917] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:10.348 [2024-07-23 05:07:41.343987] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3144284 ] 00:08:10.348 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.607 [2024-07-23 05:07:41.560171] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.607 [2024-07-23 05:07:41.636204] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.607 [2024-07-23 05:07:41.636383] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.607 [2024-07-23 05:07:41.697396] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.866 [2024-07-23 05:07:41.713760] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:10.866 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:10.866 INFO: Seed: 3757036393 00:08:10.866 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:10.866 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:10.866 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:10.866 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.866 #2 INITED exec/s: 0 rss: 60Mb 00:08:10.866 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:10.866 This may also happen if the target rejected all inputs we tried so far 00:08:10.866 [2024-07-23 05:07:41.790073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000105b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.866 [2024-07-23 05:07:41.790115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.125 NEW_FUNC[1/671]: 0x4915d0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:11.125 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.125 #11 NEW cov: 11521 ft: 11520 corp: 2/11b lim: 40 exec/s: 0 rss: 67Mb L: 10/10 MS: 4 ShuffleBytes-ChangeByte-CMP-CopyPart- DE: "\000\000\000\020"- 00:08:11.384 [2024-07-23 05:07:42.242136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.242182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.384 [2024-07-23 05:07:42.242316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.242343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.384 [2024-07-23 05:07:42.242491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.242514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.384 [2024-07-23 05:07:42.242648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.242671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.384 #14 NEW cov: 11634 ft: 12929 corp: 3/45b lim: 40 exec/s: 0 rss: 67Mb L: 34/34 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes- 00:08:11.384 [2024-07-23 05:07:42.311313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2b0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.311347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.384 #18 NEW cov: 11640 ft: 13119 corp: 4/55b 
lim: 40 exec/s: 0 rss: 67Mb L: 10/34 MS: 4 InsertByte-ChangeByte-CopyPart-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:11.384 [2024-07-23 05:07:42.362299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a9b9b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.362334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.384 [2024-07-23 05:07:42.362469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9b6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.362491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.384 [2024-07-23 05:07:42.362630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.362650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.384 [2024-07-23 05:07:42.362775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.362796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.384 #19 NEW cov: 11725 ft: 13374 corp: 5/92b lim: 40 exec/s: 0 rss: 67Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:11.384 [2024-07-23 05:07:42.432571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.432606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.384 [2024-07-23 05:07:42.432739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.432760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.384 [2024-07-23 05:07:42.432885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.432907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.384 [2024-07-23 05:07:42.433050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.384 [2024-07-23 05:07:42.433072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.384 #29 NEW cov: 11725 ft: 13585 corp: 6/125b lim: 40 exec/s: 0 rss: 67Mb L: 33/37 MS: 5 ChangeBit-ShuffleBytes-CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:11.644 [2024-07-23 05:07:42.492781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:57000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.492815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.644 [2024-07-23 05:07:42.492942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:10575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.492962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.644 [2024-07-23 05:07:42.493098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.493120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.644 [2024-07-23 05:07:42.493207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.493228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.644 #30 NEW cov: 11725 ft: 13703 corp: 7/158b lim: 40 exec/s: 0 rss: 67Mb L: 33/37 MS: 1 PersAutoDict- DE: "\000\000\000\020"- 00:08:11.644 [2024-07-23 05:07:42.562980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.563015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.644 [2024-07-23 05:07:42.563147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.563169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.644 [2024-07-23 05:07:42.563296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.563317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.644 [2024-07-23 05:07:42.563448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.563471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.644 #31 NEW cov: 11725 ft: 13738 corp: 8/194b lim: 40 exec/s: 0 rss: 67Mb L: 36/37 MS: 1 CopyPart- 00:08:11.644 [2024-07-23 05:07:42.623135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.623169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.644 [2024-07-23 05:07:42.623301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 
cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.623327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.644 [2024-07-23 05:07:42.623472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.623494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.644 [2024-07-23 05:07:42.623624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.623649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.644 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.644 #32 NEW cov: 11748 ft: 13789 corp: 9/227b lim: 40 exec/s: 0 rss: 67Mb L: 33/37 MS: 1 ShuffleBytes- 00:08:11.644 [2024-07-23 05:07:42.682537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0000105b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.644 [2024-07-23 05:07:42.682573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.644 #33 NEW cov: 11748 ft: 13812 corp: 10/237b lim: 40 exec/s: 0 rss: 68Mb L: 10/37 MS: 1 ChangeBinInt- 00:08:11.903 [2024-07-23 05:07:42.752594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2bfa00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.752628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.903 #34 NEW cov: 11748 ft: 13858 corp: 11/247b lim: 40 exec/s: 34 rss: 68Mb L: 10/37 MS: 1 ChangeBinInt- 00:08:11.903 [2024-07-23 05:07:42.823758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a9b9b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.823793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.903 [2024-07-23 05:07:42.823925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9b6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.823947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.903 [2024-07-23 05:07:42.824079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:6a6a6a6a cdw11:6a686a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.824101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.903 [2024-07-23 05:07:42.824237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.824259] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.903 #35 NEW cov: 11748 ft: 13982 corp: 12/284b lim: 40 exec/s: 35 rss: 68Mb L: 37/37 MS: 1 ChangeBit- 00:08:11.903 [2024-07-23 05:07:42.893956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.893990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.903 [2024-07-23 05:07:42.894122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a6a0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.894146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.903 [2024-07-23 05:07:42.894287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:105b105b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.894310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.903 [2024-07-23 05:07:42.894448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.894470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.903 #36 NEW cov: 11748 ft: 13987 corp: 13/318b lim: 40 exec/s: 36 rss: 68Mb L: 34/37 MS: 1 CrossOver- 00:08:11.903 [2024-07-23 05:07:42.954129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:57000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.903 [2024-07-23 05:07:42.954164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.903 [2024-07-23 05:07:42.954300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:10575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.904 [2024-07-23 05:07:42.954323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.904 [2024-07-23 05:07:42.954458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:57255757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.904 [2024-07-23 05:07:42.954481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.904 [2024-07-23 05:07:42.954619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.904 [2024-07-23 05:07:42.954641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.904 #37 NEW cov: 11748 ft: 14034 corp: 14/352b lim: 40 exec/s: 37 rss: 68Mb L: 34/37 MS: 1 InsertByte- 00:08:12.163 [2024-07-23 05:07:43.024380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) 
qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:5757574d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.024415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.163 [2024-07-23 05:07:43.024561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.024584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.163 [2024-07-23 05:07:43.024715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.024737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.163 [2024-07-23 05:07:43.024835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.024860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.163 #38 NEW cov: 11748 ft: 14068 corp: 15/388b lim: 40 exec/s: 38 rss: 68Mb L: 36/37 MS: 1 ChangeBinInt- 00:08:12.163 [2024-07-23 05:07:43.094526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a9b9b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.094560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.163 [2024-07-23 05:07:43.094700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9b6a6a6a cdw11:6a5b105b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.094720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.163 [2024-07-23 05:07:43.094853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.094875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.163 [2024-07-23 05:07:43.095005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94006a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.095028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.163 #39 NEW cov: 11748 ft: 14085 corp: 16/425b lim: 40 exec/s: 39 rss: 68Mb L: 37/37 MS: 1 CrossOver- 00:08:12.163 [2024-07-23 05:07:43.153872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2bfaf7 cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.153907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.163 #40 NEW cov: 11748 ft: 14092 corp: 17/435b lim: 40 exec/s: 40 rss: 68Mb L: 10/37 MS: 1 
ChangeBinInt- 00:08:12.163 [2024-07-23 05:07:43.224127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2bfa6f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.163 [2024-07-23 05:07:43.224163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.163 #41 NEW cov: 11748 ft: 14163 corp: 18/446b lim: 40 exec/s: 41 rss: 68Mb L: 11/37 MS: 1 InsertByte- 00:08:12.421 [2024-07-23 05:07:43.285190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.421 [2024-07-23 05:07:43.285223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.421 [2024-07-23 05:07:43.285364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.285385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.422 [2024-07-23 05:07:43.285525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:57575757 cdw11:cc575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.285546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.422 [2024-07-23 05:07:43.285678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.285699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.422 #42 NEW cov: 11748 ft: 14262 corp: 19/480b lim: 40 exec/s: 42 rss: 68Mb L: 34/37 MS: 1 InsertByte- 00:08:12.422 [2024-07-23 05:07:43.354614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000500 cdw11:0000105b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.354648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.422 #43 NEW cov: 11748 ft: 14311 corp: 20/490b lim: 40 exec/s: 43 rss: 69Mb L: 10/37 MS: 1 CMP- DE: "\005\000\000\000"- 00:08:12.422 [2024-07-23 05:07:43.425682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:57000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.425722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.422 [2024-07-23 05:07:43.425850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:10575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.425872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.422 [2024-07-23 05:07:43.426009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:57255757 cdw11:57a9a457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:12.422 [2024-07-23 05:07:43.426028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.422 [2024-07-23 05:07:43.426158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.426180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.422 #44 NEW cov: 11748 ft: 14339 corp: 21/524b lim: 40 exec/s: 44 rss: 69Mb L: 34/37 MS: 1 ChangeBinInt- 00:08:12.422 [2024-07-23 05:07:43.495593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.495629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.422 [2024-07-23 05:07:43.495769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.495789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.422 [2024-07-23 05:07:43.495922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:cc575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.422 [2024-07-23 05:07:43.495945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.680 #45 NEW cov: 11748 ft: 14549 corp: 22/554b lim: 40 exec/s: 45 rss: 69Mb L: 30/37 MS: 1 EraseBytes- 00:08:12.680 [2024-07-23 05:07:43.566040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.680 [2024-07-23 05:07:43.566073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.680 [2024-07-23 05:07:43.566213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.680 [2024-07-23 05:07:43.566235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.680 [2024-07-23 05:07:43.566381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.680 [2024-07-23 05:07:43.566403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.681 [2024-07-23 05:07:43.566538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:57575757 cdw11:57575757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.566561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.681 #46 NEW cov: 11748 ft: 14575 corp: 23/590b lim: 40 exec/s: 46 rss: 69Mb L: 36/37 MS: 1 ShuffleBytes- 00:08:12.681 [2024-07-23 05:07:43.625448] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a0094 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.625482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.681 #50 NEW cov: 11748 ft: 14642 corp: 24/601b lim: 40 exec/s: 50 rss: 69Mb L: 11/37 MS: 4 ShuffleBytes-ShuffleBytes-CopyPart-CrossOver- 00:08:12.681 [2024-07-23 05:07:43.676532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2b0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.676568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.681 [2024-07-23 05:07:43.676715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:64646464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.676738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.681 [2024-07-23 05:07:43.676875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:64646464 cdw11:64646464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.676898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.681 [2024-07-23 05:07:43.676976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:64646464 cdw11:64646464 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.676998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.681 #51 NEW cov: 11748 ft: 14665 corp: 25/635b lim: 40 exec/s: 51 rss: 69Mb L: 34/37 MS: 1 InsertRepeatedBytes- 00:08:12.681 [2024-07-23 05:07:43.736734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a9b9b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.736769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.681 [2024-07-23 05:07:43.736901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9b6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.736923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.681 [2024-07-23 05:07:43.737056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:6a6a6a6a cdw11:6a686a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.737078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.681 [2024-07-23 05:07:43.737212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.681 [2024-07-23 05:07:43.737233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:08:12.940 #52 NEW cov: 11748 ft: 14675 corp: 26/672b lim: 40 exec/s: 26 rss: 69Mb L: 37/37 MS: 1 ShuffleBytes- 00:08:12.940 #52 DONE cov: 11748 ft: 14675 corp: 26/672b lim: 40 exec/s: 26 rss: 69Mb 00:08:12.940 ###### Recommended dictionary. ###### 00:08:12.940 "\000\000\000\020" # Uses: 1 00:08:12.940 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:12.940 "\005\000\000\000" # Uses: 0 00:08:12.940 ###### End of recommended dictionary. ###### 00:08:12.940 Done 52 runs in 2 second(s) 00:08:12.940 05:07:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:12.940 05:07:43 -- ../common.sh@72 -- # (( i++ )) 00:08:12.940 05:07:43 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.940 05:07:43 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:12.940 05:07:43 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:12.940 05:07:43 -- nvmf/run.sh@24 -- # local timen=1 00:08:12.940 05:07:43 -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.940 05:07:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:12.940 05:07:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:12.940 05:07:43 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:12.940 05:07:43 -- nvmf/run.sh@29 -- # port=4413 00:08:12.940 05:07:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:12.940 05:07:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:12.940 05:07:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.940 05:07:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:12.941 [2024-07-23 05:07:43.958067] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:12.941 [2024-07-23 05:07:43.958150] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3144827 ] 00:08:12.941 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.200 [2024-07-23 05:07:44.170724] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.200 [2024-07-23 05:07:44.246340] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:13.200 [2024-07-23 05:07:44.246519] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.459 [2024-07-23 05:07:44.307644] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.459 [2024-07-23 05:07:44.323962] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:13.459 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:13.459 INFO: Seed: 2071059877 00:08:13.459 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:13.459 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:13.459 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:13.459 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.459 #2 INITED exec/s: 0 rss: 60Mb 00:08:13.459 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.459 This may also happen if the target rejected all inputs we tried so far 00:08:13.459 [2024-07-23 05:07:44.369247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.459 [2024-07-23 05:07:44.369277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.718 NEW_FUNC[1/669]: 0x493190 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:13.718 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.718 #12 NEW cov: 11508 ft: 11509 corp: 2/10b lim: 40 exec/s: 0 rss: 67Mb L: 9/9 MS: 5 CopyPart-ChangeBit-ShuffleBytes-ShuffleBytes-CMP- DE: "+\000\000\000\000\000\000\000"- 00:08:13.718 [2024-07-23 05:07:44.800383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:00780000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.718 [2024-07-23 05:07:44.800420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.977 NEW_FUNC[1/1]: 0xf90030 in spdk_sock_prep_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/sock.h:284 00:08:13.977 #13 NEW cov: 11622 ft: 12027 corp: 3/20b lim: 40 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertByte- 00:08:13.978 [2024-07-23 05:07:44.850409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.978 [2024-07-23 05:07:44.850434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.978 #19 NEW cov: 11628 ft: 12336 corp: 4/30b lim: 40 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 PersAutoDict- DE: "+\000\000\000\000\000\000\000"- 00:08:13.978 [2024-07-23 05:07:44.890531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000078 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.978 [2024-07-23 05:07:44.890560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.978 #20 NEW cov: 11713 ft: 12614 corp: 5/38b lim: 40 exec/s: 0 rss: 67Mb L: 8/10 MS: 1 EraseBytes- 00:08:13.978 [2024-07-23 05:07:44.930643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000078 cdw11:000a0002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.978 [2024-07-23 05:07:44.930668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:13.978 #26 NEW cov: 11713 ft: 12727 corp: 6/46b lim: 40 exec/s: 0 rss: 67Mb L: 8/10 MS: 1 ChangeBinInt- 00:08:13.978 [2024-07-23 05:07:44.970704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:00000800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.978 [2024-07-23 05:07:44.970729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.978 #27 NEW cov: 11713 ft: 12766 corp: 7/56b lim: 40 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 ChangeBit- 00:08:13.978 [2024-07-23 05:07:45.010885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b001078 cdw11:000a0002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.978 [2024-07-23 05:07:45.010910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.978 #28 NEW cov: 11713 ft: 12855 corp: 8/64b lim: 40 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 ChangeBit- 00:08:13.978 [2024-07-23 05:07:45.051009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b2b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.978 [2024-07-23 05:07:45.051034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.237 #34 NEW cov: 11713 ft: 12925 corp: 9/74b lim: 40 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 PersAutoDict- DE: "+\000\000\000\000\000\000\000"- 00:08:14.237 [2024-07-23 05:07:45.091106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:00780000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.237 [2024-07-23 05:07:45.091131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.237 #40 NEW cov: 11713 ft: 12941 corp: 10/84b lim: 40 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:08:14.237 [2024-07-23 05:07:45.121227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b2b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.237 [2024-07-23 05:07:45.121251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.237 #41 NEW cov: 11713 ft: 12993 corp: 11/94b lim: 40 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 PersAutoDict- DE: "+\000\000\000\000\000\000\000"- 00:08:14.237 [2024-07-23 05:07:45.161353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.237 [2024-07-23 05:07:45.161377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.237 #42 NEW cov: 11713 ft: 13020 corp: 12/103b lim: 40 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 ChangeBit- 00:08:14.237 [2024-07-23 05:07:45.201527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2bff0000 cdw11:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.237 [2024-07-23 05:07:45.201551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.237 #43 NEW cov: 11713 ft: 13045 corp: 13/112b lim: 40 exec/s: 0 
rss: 68Mb L: 9/10 MS: 1 ChangeByte- 00:08:14.237 [2024-07-23 05:07:45.241602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:00780000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.237 [2024-07-23 05:07:45.241627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.237 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.237 #44 NEW cov: 11736 ft: 13104 corp: 14/125b lim: 40 exec/s: 0 rss: 68Mb L: 13/13 MS: 1 CrossOver- 00:08:14.237 [2024-07-23 05:07:45.281703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b00002b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.237 [2024-07-23 05:07:45.281728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.237 #45 NEW cov: 11736 ft: 13114 corp: 15/134b lim: 40 exec/s: 0 rss: 68Mb L: 9/13 MS: 1 CopyPart- 00:08:14.237 [2024-07-23 05:07:45.311767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b00002b cdw11:00c20000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.237 [2024-07-23 05:07:45.311791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.499 #46 NEW cov: 11736 ft: 13134 corp: 16/143b lim: 40 exec/s: 0 rss: 68Mb L: 9/13 MS: 1 ChangeByte- 00:08:14.499 [2024-07-23 05:07:45.352068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.499 [2024-07-23 05:07:45.352092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.499 [2024-07-23 05:07:45.352153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.499 [2024-07-23 05:07:45.352167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.499 #51 NEW cov: 11736 ft: 13494 corp: 17/164b lim: 40 exec/s: 51 rss: 69Mb L: 21/21 MS: 5 EraseBytes-ChangeByte-CrossOver-CrossOver-InsertRepeatedBytes- 00:08:14.499 [2024-07-23 05:07:45.392024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:00780000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.499 [2024-07-23 05:07:45.392049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.499 #52 NEW cov: 11736 ft: 13563 corp: 18/177b lim: 40 exec/s: 52 rss: 69Mb L: 13/21 MS: 1 CrossOver- 00:08:14.499 [2024-07-23 05:07:45.432143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b001078 cdw11:000a0002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.499 [2024-07-23 05:07:45.432167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.499 #53 NEW cov: 11736 ft: 13609 corp: 19/185b lim: 40 exec/s: 53 rss: 69Mb L: 8/21 MS: 1 ShuffleBytes- 00:08:14.499 [2024-07-23 
05:07:45.472309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b001078 cdw11:c70a0002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.499 [2024-07-23 05:07:45.472333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.499 #54 NEW cov: 11736 ft: 13646 corp: 20/193b lim: 40 exec/s: 54 rss: 69Mb L: 8/21 MS: 1 ChangeByte- 00:08:14.499 [2024-07-23 05:07:45.512402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:04000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.499 [2024-07-23 05:07:45.512425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.499 #55 NEW cov: 11736 ft: 13670 corp: 21/202b lim: 40 exec/s: 55 rss: 69Mb L: 9/21 MS: 1 ChangeBinInt- 00:08:14.500 [2024-07-23 05:07:45.542459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:4a2b0010 cdw11:78000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.500 [2024-07-23 05:07:45.542483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.500 #63 NEW cov: 11736 ft: 13681 corp: 22/211b lim: 40 exec/s: 63 rss: 69Mb L: 9/21 MS: 3 ChangeBit-ShuffleBytes-CrossOver- 00:08:14.500 [2024-07-23 05:07:45.572618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b00100a cdw11:000a0002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.500 [2024-07-23 05:07:45.572643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.793 #64 NEW cov: 11736 ft: 13701 corp: 23/219b lim: 40 exec/s: 64 rss: 69Mb L: 8/21 MS: 1 CrossOver- 00:08:14.793 [2024-07-23 05:07:45.612713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2bff0000 cdw11:00004020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.612739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.793 #65 NEW cov: 11736 ft: 13740 corp: 24/228b lim: 40 exec/s: 65 rss: 69Mb L: 9/21 MS: 1 ChangeBit- 00:08:14.793 [2024-07-23 05:07:45.653223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:00aaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.653249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.793 [2024-07-23 05:07:45.653322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.653337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.793 [2024-07-23 05:07:45.653395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.653408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:08:14.793 [2024-07-23 05:07:45.653457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:aaaaaaaa cdw11:aaaa7800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.653470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.793 #66 NEW cov: 11736 ft: 14238 corp: 25/263b lim: 40 exec/s: 66 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:14.793 [2024-07-23 05:07:45.692942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:4a3b0010 cdw11:78000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.692967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.793 #67 NEW cov: 11736 ft: 14247 corp: 26/272b lim: 40 exec/s: 67 rss: 69Mb L: 9/35 MS: 1 ChangeByte- 00:08:14.793 [2024-07-23 05:07:45.733037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000000 cdw11:000a0002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.733065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.793 #68 NEW cov: 11736 ft: 14283 corp: 27/280b lim: 40 exec/s: 68 rss: 69Mb L: 8/35 MS: 1 CrossOver- 00:08:14.793 [2024-07-23 05:07:45.773130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000078 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.773154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.793 #69 NEW cov: 11736 ft: 14307 corp: 28/288b lim: 40 exec/s: 69 rss: 69Mb L: 8/35 MS: 1 ChangeByte- 00:08:14.793 [2024-07-23 05:07:45.803359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b00002b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.803383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.793 [2024-07-23 05:07:45.803446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00780000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.803459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.793 #70 NEW cov: 11736 ft: 14326 corp: 29/306b lim: 40 exec/s: 70 rss: 69Mb L: 18/35 MS: 1 PersAutoDict- DE: "+\000\000\000\000\000\000\000"- 00:08:14.793 [2024-07-23 05:07:45.843362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b001008 cdw11:000a0002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.793 [2024-07-23 05:07:45.843387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.053 #71 NEW cov: 11736 ft: 14334 corp: 30/314b lim: 40 exec/s: 71 rss: 69Mb L: 8/35 MS: 1 ChangeBit- 00:08:15.053 [2024-07-23 05:07:45.883510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b2b0000 cdw11:00000000 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:15.053 [2024-07-23 05:07:45.883536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.053 #72 NEW cov: 11736 ft: 14347 corp: 31/323b lim: 40 exec/s: 72 rss: 69Mb L: 9/35 MS: 1 PersAutoDict- DE: "+\000\000\000\000\000\000\000"- 00:08:15.053 [2024-07-23 05:07:45.923626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b00000a cdw11:04000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.053 [2024-07-23 05:07:45.923652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.053 #73 NEW cov: 11736 ft: 14356 corp: 32/332b lim: 40 exec/s: 73 rss: 70Mb L: 9/35 MS: 1 ChangeBinInt- 00:08:15.053 [2024-07-23 05:07:45.963779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b00002b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.053 [2024-07-23 05:07:45.963804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.053 #74 NEW cov: 11736 ft: 14363 corp: 33/345b lim: 40 exec/s: 74 rss: 70Mb L: 13/35 MS: 1 PersAutoDict- DE: "+\000\000\000\000\000\000\000"- 00:08:15.053 [2024-07-23 05:07:46.003823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000086 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.053 [2024-07-23 05:07:46.003848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.053 #75 NEW cov: 11736 ft: 14372 corp: 34/354b lim: 40 exec/s: 75 rss: 70Mb L: 9/35 MS: 1 InsertByte- 00:08:15.053 [2024-07-23 05:07:46.043992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000878 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.053 [2024-07-23 05:07:46.044020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.053 #76 NEW cov: 11736 ft: 14377 corp: 35/362b lim: 40 exec/s: 76 rss: 70Mb L: 8/35 MS: 1 ChangeBit- 00:08:15.053 [2024-07-23 05:07:46.084136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b2b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.053 [2024-07-23 05:07:46.084161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.053 #77 NEW cov: 11736 ft: 14394 corp: 36/370b lim: 40 exec/s: 77 rss: 70Mb L: 8/35 MS: 1 EraseBytes- 00:08:15.053 [2024-07-23 05:07:46.124241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b000078 cdw11:00780000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.053 [2024-07-23 05:07:46.124265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.053 #78 NEW cov: 11736 ft: 14402 corp: 37/383b lim: 40 exec/s: 78 rss: 70Mb L: 13/35 MS: 1 CopyPart- 00:08:15.313 [2024-07-23 05:07:46.164395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b00002b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.313 [2024-07-23 
05:07:46.164421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.313 #79 NEW cov: 11736 ft: 14418 corp: 38/393b lim: 40 exec/s: 79 rss: 70Mb L: 10/35 MS: 1 EraseBytes- 00:08:15.313 [2024-07-23 05:07:46.204623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b00002b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.313 [2024-07-23 05:07:46.204648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.313 [2024-07-23 05:07:46.204706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000078 cdw11:2b00002b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.313 [2024-07-23 05:07:46.204719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.313 #80 NEW cov: 11736 ft: 14436 corp: 39/415b lim: 40 exec/s: 80 rss: 70Mb L: 22/35 MS: 1 CrossOver- 00:08:15.313 [2024-07-23 05:07:46.244605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b2b0000 cdw11:08000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.313 [2024-07-23 05:07:46.244630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.313 #81 NEW cov: 11736 ft: 14446 corp: 40/423b lim: 40 exec/s: 81 rss: 70Mb L: 8/35 MS: 1 ChangeBinInt- 00:08:15.313 [2024-07-23 05:07:46.284727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:4a2b0000 cdw11:100a7800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.313 [2024-07-23 05:07:46.284752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.313 #82 NEW cov: 11736 ft: 14458 corp: 41/432b lim: 40 exec/s: 82 rss: 70Mb L: 9/35 MS: 1 ShuffleBytes- 00:08:15.313 [2024-07-23 05:07:46.324819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2b001008 cdw11:00100002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.313 [2024-07-23 05:07:46.324844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.313 #83 NEW cov: 11736 ft: 14514 corp: 42/440b lim: 40 exec/s: 83 rss: 70Mb L: 8/35 MS: 1 ChangeBinInt- 00:08:15.313 [2024-07-23 05:07:46.364958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:27000078 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.313 [2024-07-23 05:07:46.364983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.313 #84 NEW cov: 11736 ft: 14528 corp: 43/448b lim: 40 exec/s: 42 rss: 70Mb L: 8/35 MS: 1 ChangeBinInt- 00:08:15.313 #84 DONE cov: 11736 ft: 14528 corp: 43/448b lim: 40 exec/s: 42 rss: 70Mb 00:08:15.313 ###### Recommended dictionary. ###### 00:08:15.313 "+\000\000\000\000\000\000\000" # Uses: 6 00:08:15.313 ###### End of recommended dictionary. 
###### 00:08:15.313 Done 84 runs in 2 second(s) 00:08:15.572 05:07:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:15.572 05:07:46 -- ../common.sh@72 -- # (( i++ )) 00:08:15.572 05:07:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.572 05:07:46 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:15.572 05:07:46 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:15.572 05:07:46 -- nvmf/run.sh@24 -- # local timen=1 00:08:15.572 05:07:46 -- nvmf/run.sh@25 -- # local core=0x1 00:08:15.572 05:07:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:15.572 05:07:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:15.572 05:07:46 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:15.572 05:07:46 -- nvmf/run.sh@29 -- # port=4414 00:08:15.572 05:07:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:15.572 05:07:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:15.572 05:07:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.573 05:07:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:15.573 [2024-07-23 05:07:46.560570] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:15.573 [2024-07-23 05:07:46.560640] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3145285 ] 00:08:15.573 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.832 [2024-07-23 05:07:46.776679] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.832 [2024-07-23 05:07:46.852545] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:15.832 [2024-07-23 05:07:46.852721] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.832 [2024-07-23 05:07:46.913674] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:16.091 [2024-07-23 05:07:46.930041] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:16.091 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.091 INFO: Seed: 383097902 00:08:16.091 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:16.091 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:16.091 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:16.091 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.091 #2 INITED exec/s: 0 rss: 60Mb 00:08:16.091 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:16.091 This may also happen if the target rejected all inputs we tried so far 00:08:16.091 [2024-07-23 05:07:46.985619] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.091 [2024-07-23 05:07:46.985657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.091 [2024-07-23 05:07:46.985722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:5 cdw10:80000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.091 [2024-07-23 05:07:46.985743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.350 NEW_FUNC[1/671]: 0x494d50 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:16.350 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.350 #12 NEW cov: 11510 ft: 11511 corp: 2/17b lim: 35 exec/s: 0 rss: 67Mb L: 16/16 MS: 5 InsertByte-ChangeByte-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:16.351 [2024-07-23 05:07:47.427035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.351 [2024-07-23 05:07:47.427075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.351 [2024-07-23 05:07:47.427145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.351 [2024-07-23 05:07:47.427164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.610 NEW_FUNC[1/2]: 0x4b60f0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:16.610 NEW_FUNC[2/2]: 0x1153fb0 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1651 00:08:16.610 #13 NEW cov: 11656 ft: 12177 corp: 3/41b lim: 35 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:16.610 [2024-07-23 05:07:47.476851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.476886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.610 [2024-07-23 05:07:47.476959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.476980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.610 #14 NEW cov: 11662 ft: 12523 corp: 4/59b lim: 35 exec/s: 0 rss: 67Mb L: 18/24 MS: 1 InsertRepeatedBytes- 00:08:16.610 [2024-07-23 05:07:47.516917] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.516949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.610 [2024-07-23 05:07:47.517018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.517039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.610 #15 NEW cov: 11747 ft: 12751 corp: 5/76b lim: 35 exec/s: 0 rss: 67Mb L: 17/24 MS: 1 InsertByte- 00:08:16.610 [2024-07-23 05:07:47.577431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.577468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.610 [2024-07-23 05:07:47.577539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.577560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.610 [2024-07-23 05:07:47.577624] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.577642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.610 #16 NEW cov: 11747 ft: 13061 corp: 6/105b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:16.610 [2024-07-23 05:07:47.637201] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.637232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.610 [2024-07-23 05:07:47.637302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.637320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.610 #17 NEW cov: 11747 ft: 13135 corp: 7/121b lim: 35 exec/s: 0 rss: 67Mb L: 16/29 MS: 1 CMP- DE: "\377\033"- 00:08:16.610 [2024-07-23 05:07:47.677776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.677807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.610 [2024-07-23 05:07:47.677875] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.677896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.610 [2024-07-23 05:07:47.677963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.610 [2024-07-23 05:07:47.677981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.870 #18 NEW cov: 11747 ft: 
13170 corp: 8/150b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ShuffleBytes- 00:08:16.870 [2024-07-23 05:07:47.737786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.737819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.737890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.737909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.870 #24 NEW cov: 11747 ft: 13197 corp: 9/174b lim: 35 exec/s: 0 rss: 68Mb L: 24/29 MS: 1 ChangeBit- 00:08:16.870 [2024-07-23 05:07:47.788086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.788118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.788187] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.788206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.788272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:6 cdw10:80000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.788293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.788361] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000030 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.788379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.870 #25 NEW cov: 11747 ft: 13362 corp: 10/204b lim: 35 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:16.870 [2024-07-23 05:07:47.848266] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.848302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.848368] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.848386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.848455] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.848473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.870 NEW_FUNC[1/1]: 0x195e300 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.870 #30 NEW cov: 11770 ft: 13413 corp: 11/235b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 5 ShuffleBytes-CopyPart-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:16.870 [2024-07-23 05:07:47.898246] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.898279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.898353] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000002f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.898372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.870 #31 NEW cov: 11770 ft: 13424 corp: 12/259b lim: 35 exec/s: 0 rss: 68Mb L: 24/31 MS: 1 ChangeByte- 00:08:16.870 [2024-07-23 05:07:47.948728] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:5 cdw10:80000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.948762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.948830] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.948848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.948917] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000008d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.948938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.870 [2024-07-23 05:07:47.949004] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.870 [2024-07-23 05:07:47.949021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.130 #32 NEW cov: 11770 ft: 13508 corp: 13/294b lim: 35 exec/s: 32 rss: 68Mb L: 35/35 MS: 1 CrossOver- 00:08:17.130 [2024-07-23 05:07:47.998576] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:47.998610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.130 [2024-07-23 05:07:47.998676] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000002f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:47.998694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.130 #33 NEW cov: 11770 ft: 13527 corp: 14/317b lim: 35 exec/s: 33 rss: 68Mb L: 23/35 MS: 1 EraseBytes- 00:08:17.130 [2024-07-23 05:07:48.058325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:17.130 [2024-07-23 05:07:48.058361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.130 #38 NEW cov: 11770 ft: 14243 corp: 15/329b lim: 35 exec/s: 38 rss: 68Mb L: 12/35 MS: 5 CopyPart-ChangeBit-CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:17.130 [2024-07-23 05:07:48.108860] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:48.108893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.130 [2024-07-23 05:07:48.108964] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000002f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:48.108983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.130 #39 NEW cov: 11770 ft: 14284 corp: 16/352b lim: 35 exec/s: 39 rss: 68Mb L: 23/35 MS: 1 ShuffleBytes- 00:08:17.130 [2024-07-23 05:07:48.168774] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000de SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:48.168806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.130 [2024-07-23 05:07:48.168878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:48.168900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.130 #40 NEW cov: 11770 ft: 14337 corp: 17/369b lim: 35 exec/s: 40 rss: 68Mb L: 17/35 MS: 1 ChangeBit- 00:08:17.130 [2024-07-23 05:07:48.219229] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:48.219261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.130 [2024-07-23 05:07:48.219330] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:48.219351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.130 [2024-07-23 05:07:48.219419] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:6 cdw10:80000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:48.219447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.130 [2024-07-23 05:07:48.219518] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000030 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.130 [2024-07-23 05:07:48.219536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.390 #41 NEW cov: 11770 ft: 14348 corp: 18/399b lim: 35 exec/s: 41 rss: 68Mb L: 30/35 MS: 1 PersAutoDict- DE: "\377\033"- 00:08:17.390 [2024-07-23 05:07:48.269480] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.269511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.390 [2024-07-23 05:07:48.269585] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000f9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.269606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.390 [2024-07-23 05:07:48.269677] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.269699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.390 #42 NEW cov: 11770 ft: 14463 corp: 19/428b lim: 35 exec/s: 42 rss: 68Mb L: 29/35 MS: 1 ChangeBinInt- 00:08:17.390 [2024-07-23 05:07:48.309484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.309516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.390 [2024-07-23 05:07:48.309593] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.309615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.390 [2024-07-23 05:07:48.309681] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.309698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.390 [2024-07-23 05:07:48.309764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000030 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.309782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.390 #43 NEW cov: 11770 ft: 14475 corp: 20/462b lim: 35 exec/s: 43 rss: 68Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:08:17.390 [2024-07-23 05:07:48.359894] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.359926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.390 [2024-07-23 05:07:48.359996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000002f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.360015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.390 [2024-07-23 05:07:48.360083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.360101] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.390 [2024-07-23 05:07:48.360167] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.360185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.390 #44 NEW cov: 11770 ft: 14494 corp: 21/497b lim: 35 exec/s: 44 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:17.390 [2024-07-23 05:07:48.419493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.419528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.390 [2024-07-23 05:07:48.419598] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.419619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.390 #45 NEW cov: 11770 ft: 14504 corp: 22/515b lim: 35 exec/s: 45 rss: 68Mb L: 18/35 MS: 1 ChangeBinInt- 00:08:17.390 [2024-07-23 05:07:48.469394] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.390 [2024-07-23 05:07:48.469425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.650 #46 NEW cov: 11770 ft: 14532 corp: 23/527b lim: 35 exec/s: 46 rss: 69Mb L: 12/35 MS: 1 CopyPart- 00:08:17.650 [2024-07-23 05:07:48.530214] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.530246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.650 [2024-07-23 05:07:48.530315] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.530334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.650 [2024-07-23 05:07:48.530403] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.530421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.650 #47 NEW cov: 11770 ft: 14557 corp: 24/560b lim: 35 exec/s: 47 rss: 69Mb L: 33/35 MS: 1 CrossOver- 00:08:17.650 [2024-07-23 05:07:48.579900] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.579931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.650 [2024-07-23 05:07:48.580000] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:5 cdw10:80000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 
[2024-07-23 05:07:48.580021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.650 #48 NEW cov: 11770 ft: 14643 corp: 25/576b lim: 35 exec/s: 48 rss: 69Mb L: 16/35 MS: 1 ChangeByte- 00:08:17.650 [2024-07-23 05:07:48.620351] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.620382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.650 [2024-07-23 05:07:48.620456] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.620477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.650 [2024-07-23 05:07:48.620544] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.620562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.650 [2024-07-23 05:07:48.620626] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000030 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.620644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.650 #49 NEW cov: 11770 ft: 14670 corp: 26/610b lim: 35 exec/s: 49 rss: 69Mb L: 34/35 MS: 1 ChangeBit- 00:08:17.650 [2024-07-23 05:07:48.680092] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.680124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.650 #50 NEW cov: 11770 ft: 14686 corp: 27/622b lim: 35 exec/s: 50 rss: 69Mb L: 12/35 MS: 1 ShuffleBytes- 00:08:17.650 [2024-07-23 05:07:48.740378] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000de SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.740410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.650 [2024-07-23 05:07:48.740488] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.650 [2024-07-23 05:07:48.740510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.910 #51 NEW cov: 11770 ft: 14720 corp: 28/639b lim: 35 exec/s: 51 rss: 69Mb L: 17/35 MS: 1 ShuffleBytes- 00:08:17.910 [2024-07-23 05:07:48.790592] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.790624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.910 [2024-07-23 05:07:48.790695] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003f SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.790717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.910 #52 NEW cov: 11770 ft: 14736 corp: 29/656b lim: 35 exec/s: 52 rss: 69Mb L: 17/35 MS: 1 ChangeBinInt- 00:08:17.910 [2024-07-23 05:07:48.830883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.830914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.910 [2024-07-23 05:07:48.830985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.831004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.910 #53 NEW cov: 11770 ft: 14742 corp: 30/679b lim: 35 exec/s: 53 rss: 69Mb L: 23/35 MS: 1 ChangeBinInt- 00:08:17.910 [2024-07-23 05:07:48.881006] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.881037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.910 [2024-07-23 05:07:48.881109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.881127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.910 #54 NEW cov: 11770 ft: 14771 corp: 31/701b lim: 35 exec/s: 54 rss: 69Mb L: 22/35 MS: 1 EraseBytes- 00:08:17.910 [2024-07-23 05:07:48.921258] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.921289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.910 [2024-07-23 05:07:48.921359] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.921380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.910 [2024-07-23 05:07:48.921451] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:6 cdw10:80000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.921471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.910 [2024-07-23 05:07:48.921542] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000030 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.910 [2024-07-23 05:07:48.921560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.910 #55 NEW cov: 11770 ft: 14782 corp: 32/731b lim: 35 exec/s: 55 rss: 69Mb L: 30/35 MS: 1 ChangeBinInt- 00:08:17.910 [2024-07-23 05:07:48.971434] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET 
FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.910 [2024-07-23 05:07:48.971470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:17.910 [2024-07-23 05:07:48.971538] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.910 [2024-07-23 05:07:48.971559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:17.910 [2024-07-23 05:07:48.971626] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.910 [2024-07-23 05:07:48.971646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:17.910 #56 NEW cov: 11770 ft: 14816 corp: 33/765b lim: 35 exec/s: 28 rss: 69Mb L: 34/35 MS: 1 InsertRepeatedBytes-
00:08:17.910 #56 DONE cov: 11770 ft: 14816 corp: 33/765b lim: 35 exec/s: 28 rss: 69Mb
00:08:17.910 ###### Recommended dictionary. ######
00:08:17.910 "\377\033" # Uses: 1
00:08:17.910 ###### End of recommended dictionary. ######
00:08:17.910 Done 56 runs in 2 second(s)
00:08:18.170 05:07:49 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf
00:08:18.170 05:07:49 -- ../common.sh@72 -- # (( i++ ))
00:08:18.170 05:07:49 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:18.170 05:07:49 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1
00:08:18.170 05:07:49 -- nvmf/run.sh@23 -- # local fuzzer_type=15
00:08:18.170 05:07:49 -- nvmf/run.sh@24 -- # local timen=1
00:08:18.170 05:07:49 -- nvmf/run.sh@25 -- # local core=0x1
00:08:18.170 05:07:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15
00:08:18.170 05:07:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf
00:08:18.170 05:07:49 -- nvmf/run.sh@29 -- # printf %02d 15
00:08:18.170 05:07:49 -- nvmf/run.sh@29 -- # port=4415
00:08:18.170 05:07:49 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15
00:08:18.170 05:07:49 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415'
00:08:18.170 05:07:49 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:18.170 05:07:49 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock
00:08:18.170 [2024-07-23 05:07:49.181469] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:18.170 [2024-07-23 05:07:49.181537] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3145656 ]
00:08:18.170 EAL: No free 2048 kB hugepages reported on node 1
00:08:18.429 [2024-07-23 05:07:49.400417] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:18.429 [2024-07-23 05:07:49.476852] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:18.429 [2024-07-23 05:07:49.477030] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:18.689 [2024-07-23 05:07:49.538364] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:18.689 [2024-07-23 05:07:49.554728] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 ***
00:08:18.689 INFO: Running with entropic power schedule (0xFF, 100).
00:08:18.689 INFO: Seed: 3007097037
00:08:18.689 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:08:18.689 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:08:18.689 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15
00:08:18.689 INFO: A corpus is not provided, starting from an empty corpus
00:08:18.689 #2 INITED exec/s: 0 rss: 60Mb
00:08:18.689 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:18.689 This may also happen if the target rejected all inputs we tried so far
00:08:18.689 [2024-07-23 05:07:49.631488] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:18.689 [2024-07-23 05:07:49.631532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:18.689 [2024-07-23 05:07:49.631679] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:18.689 [2024-07-23 05:07:49.631701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:18.689 [2024-07-23 05:07:49.631846] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:18.689 [2024-07-23 05:07:49.631867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:19.257 NEW_FUNC[1/670]: 0x496290 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460
00:08:19.258 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:19.258 #6 NEW cov: 11491 ft: 11491 corp: 2/22b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 4 CrossOver-EraseBytes-ChangeBit-InsertRepeatedBytes-
00:08:19.258 [2024-07-23 05:07:50.082822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:19.258 [2024-07-23 05:07:50.082872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:19.258 [2024-07-23 05:07:50.083028]
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.083050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.258 [2024-07-23 05:07:50.083182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.083203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.258 #7 NEW cov: 11604 ft: 12056 corp: 3/43b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 ChangeByte- 00:08:19.258 [2024-07-23 05:07:50.152815] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.152850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.258 [2024-07-23 05:07:50.153002] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.153024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.258 [2024-07-23 05:07:50.153174] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.153195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.258 #8 NEW cov: 11610 ft: 12204 corp: 4/68b lim: 35 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CrossOver- 00:08:19.258 [2024-07-23 05:07:50.223098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.223133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.258 [2024-07-23 05:07:50.223294] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.223315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.258 [2024-07-23 05:07:50.223456] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.223478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.258 #9 NEW cov: 11695 ft: 12555 corp: 5/89b lim: 35 exec/s: 0 rss: 67Mb L: 21/25 MS: 1 ChangeBinInt- 00:08:19.258 [2024-07-23 05:07:50.283483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.283519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.258 [2024-07-23 05:07:50.283659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.283680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.258 [2024-07-23 05:07:50.283818] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.283841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.258 [2024-07-23 05:07:50.283985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.258 [2024-07-23 05:07:50.284006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.258 #10 NEW cov: 11695 ft: 13068 corp: 6/118b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 CopyPart- 00:08:19.517 [2024-07-23 05:07:50.353539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.517 [2024-07-23 05:07:50.353574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.517 [2024-07-23 05:07:50.353730] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.517 [2024-07-23 05:07:50.353751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.517 [2024-07-23 05:07:50.353884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.517 [2024-07-23 05:07:50.353906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.517 #11 NEW cov: 11695 ft: 13196 corp: 7/143b lim: 35 exec/s: 0 rss: 67Mb L: 25/29 MS: 1 ChangeBit- 00:08:19.517 NEW_FUNC[1/1]: 0x4b60f0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:19.517 #12 NEW cov: 11709 ft: 13538 corp: 8/155b lim: 35 exec/s: 0 rss: 68Mb L: 12/29 MS: 1 InsertRepeatedBytes- 00:08:19.517 [2024-07-23 05:07:50.484117] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.517 [2024-07-23 05:07:50.484153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.517 [2024-07-23 05:07:50.484283] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.517 [2024-07-23 05:07:50.484306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.517 [2024-07-23 05:07:50.484451] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.517 [2024-07-23 05:07:50.484472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.517 [2024-07-23 05:07:50.484618] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.517 [2024-07-23 05:07:50.484638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.517 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:19.518 #13 NEW cov: 11732 ft: 13577 corp: 9/188b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:19.518 [2024-07-23 05:07:50.554136] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.518 [2024-07-23 05:07:50.554172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.518 [2024-07-23 05:07:50.554309] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.518 [2024-07-23 05:07:50.554331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.518 [2024-07-23 05:07:50.554481] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.518 [2024-07-23 05:07:50.554502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.518 #14 NEW cov: 11732 ft: 13594 corp: 10/209b lim: 35 exec/s: 0 rss: 68Mb L: 21/33 MS: 1 ChangeByte- 00:08:19.777 [2024-07-23 05:07:50.613881] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.613915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.777 #17 NEW cov: 11732 ft: 13748 corp: 11/217b lim: 35 exec/s: 17 rss: 68Mb L: 8/33 MS: 3 CrossOver-EraseBytes-CMP- DE: "\377\377\377\377"- 00:08:19.777 [2024-07-23 05:07:50.684736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.684770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.777 [2024-07-23 05:07:50.684909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.684929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.777 [2024-07-23 05:07:50.685073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.685097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.777 [2024-07-23 05:07:50.685240] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.685263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:08:19.777 #18 NEW cov: 11732 ft: 13773 corp: 12/248b lim: 35 exec/s: 18 rss: 68Mb L: 31/33 MS: 1 CrossOver- 00:08:19.777 [2024-07-23 05:07:50.744484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000006a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.744519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.777 [2024-07-23 05:07:50.744661] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.744683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.777 #21 NEW cov: 11732 ft: 13967 corp: 13/264b lim: 35 exec/s: 21 rss: 68Mb L: 16/33 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:19.777 [2024-07-23 05:07:50.805086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.805124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.777 [2024-07-23 05:07:50.805267] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.805288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.777 [2024-07-23 05:07:50.805413] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.805435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.777 [2024-07-23 05:07:50.805578] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.777 [2024-07-23 05:07:50.805599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.777 #22 NEW cov: 11732 ft: 14055 corp: 14/297b lim: 35 exec/s: 22 rss: 68Mb L: 33/33 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:20.037 [2024-07-23 05:07:50.874923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:50.874959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.037 #23 NEW cov: 11732 ft: 14076 corp: 15/313b lim: 35 exec/s: 23 rss: 68Mb L: 16/33 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:20.037 [2024-07-23 05:07:50.945550] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:50.945588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.037 [2024-07-23 05:07:50.945733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:50.945755] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.037 [2024-07-23 05:07:50.945914] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:50.945933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.037 [2024-07-23 05:07:50.946084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:50.946106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.037 #24 NEW cov: 11732 ft: 14101 corp: 16/342b lim: 35 exec/s: 24 rss: 68Mb L: 29/33 MS: 1 CrossOver- 00:08:20.037 [2024-07-23 05:07:51.005506] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:51.005543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.037 [2024-07-23 05:07:51.005691] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:51.005718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.037 [2024-07-23 05:07:51.005867] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:51.005889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.037 #25 NEW cov: 11732 ft: 14174 corp: 17/363b lim: 35 exec/s: 25 rss: 68Mb L: 21/33 MS: 1 CopyPart- 00:08:20.037 [2024-07-23 05:07:51.065928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:51.065964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.037 [2024-07-23 05:07:51.066112] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000036f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:51.066135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.037 [2024-07-23 05:07:51.066279] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000036f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:51.066300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.037 [2024-07-23 05:07:51.066457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000036f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.037 [2024-07-23 05:07:51.066481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.037 #26 NEW cov: 11732 ft: 14217 corp: 18/393b lim: 35 exec/s: 26 rss: 68Mb L: 
30/33 MS: 1 InsertRepeatedBytes- 00:08:20.297 [2024-07-23 05:07:51.135940] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.135975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.136121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.136142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.136302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.136324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.297 #27 NEW cov: 11732 ft: 14246 corp: 19/414b lim: 35 exec/s: 27 rss: 69Mb L: 21/33 MS: 1 ChangeBit- 00:08:20.297 [2024-07-23 05:07:51.196129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.196163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.196320] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.196341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.196484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.196506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.297 #28 NEW cov: 11732 ft: 14253 corp: 20/435b lim: 35 exec/s: 28 rss: 69Mb L: 21/33 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:20.297 [2024-07-23 05:07:51.256285] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.256320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.256471] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.256494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.256649] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.256670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.297 #29 NEW cov: 11732 ft: 14257 corp: 21/457b lim: 35 exec/s: 29 rss: 69Mb L: 22/33 MS: 1 InsertByte- 00:08:20.297 [2024-07-23 
05:07:51.306402] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.306437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.306577] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.306599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.306748] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.306770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.297 #30 NEW cov: 11732 ft: 14298 corp: 22/478b lim: 35 exec/s: 30 rss: 69Mb L: 21/33 MS: 1 ShuffleBytes- 00:08:20.297 [2024-07-23 05:07:51.376694] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.376730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.376870] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.376890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.297 [2024-07-23 05:07:51.377034] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.297 [2024-07-23 05:07:51.377055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.556 #31 NEW cov: 11732 ft: 14306 corp: 23/499b lim: 35 exec/s: 31 rss: 69Mb L: 21/33 MS: 1 CrossOver- 00:08:20.556 [2024-07-23 05:07:51.446811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.556 [2024-07-23 05:07:51.446847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.556 #32 NEW cov: 11732 ft: 14334 corp: 24/515b lim: 35 exec/s: 32 rss: 69Mb L: 16/33 MS: 1 CrossOver- 00:08:20.557 [2024-07-23 05:07:51.516970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.557 [2024-07-23 05:07:51.517005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.557 [2024-07-23 05:07:51.517142] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.557 [2024-07-23 05:07:51.517167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.557 #33 NEW cov: 11732 ft: 14339 corp: 25/533b lim: 35 exec/s: 33 rss: 69Mb L: 18/33 MS: 1 EraseBytes- 
00:08:20.557 [2024-07-23 05:07:51.577607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000006a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:20.557 [2024-07-23 05:07:51.577643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:20.557 [2024-07-23 05:07:51.577780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:20.557 [2024-07-23 05:07:51.577802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:20.557 [2024-07-23 05:07:51.577941] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000494 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:20.557 [2024-07-23 05:07:51.577962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:20.557 [2024-07-23 05:07:51.578100] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000094 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:20.557 [2024-07-23 05:07:51.578120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:20.557 #34 NEW cov: 11732 ft: 14360 corp: 26/561b lim: 35 exec/s: 17 rss: 69Mb L: 28/33 MS: 1 InsertRepeatedBytes-
00:08:20.557 #34 DONE cov: 11732 ft: 14360 corp: 26/561b lim: 35 exec/s: 17 rss: 69Mb
00:08:20.557 ###### Recommended dictionary. ######
00:08:20.557 "\377\377\377\377" # Uses: 3
00:08:20.557 ###### End of recommended dictionary. ######
00:08:20.557 Done 34 runs in 2 second(s)
00:08:20.816 05:07:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf
00:08:20.816 05:07:51 -- ../common.sh@72 -- # (( i++ ))
00:08:20.816 05:07:51 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:20.816 05:07:51 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1
00:08:20.816 05:07:51 -- nvmf/run.sh@23 -- # local fuzzer_type=16
00:08:20.816 05:07:51 -- nvmf/run.sh@24 -- # local timen=1
00:08:20.816 05:07:51 -- nvmf/run.sh@25 -- # local core=0x1
00:08:20.816 05:07:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:20.816 05:07:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf
00:08:20.816 05:07:51 -- nvmf/run.sh@29 -- # printf %02d 16
00:08:20.816 05:07:51 -- nvmf/run.sh@29 -- # port=4416
00:08:20.816 05:07:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:20.816 05:07:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'
00:08:20.816 05:07:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:20.816 05:07:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock
00:08:20.816 [2024-07-23 05:07:51.800988] Starting SPDK v24.01.1-pre git
sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:20.816 [2024-07-23 05:07:51.801059] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3146198 ]
00:08:21.076 EAL: No free 2048 kB hugepages reported on node 1
00:08:21.076 [2024-07-23 05:07:52.017425] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:21.076 [2024-07-23 05:07:52.093054] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:21.076 [2024-07-23 05:07:52.093231] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:21.076 [2024-07-23 05:07:52.154303] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:21.335 [2024-07-23 05:07:52.170649] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 ***
00:08:21.335 INFO: Running with entropic power schedule (0xFF, 100).
00:08:21.335 INFO: Seed: 1329125168
00:08:21.335 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:08:21.335 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:08:21.335 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:21.335 INFO: A corpus is not provided, starting from an empty corpus
00:08:21.335 #2 INITED exec/s: 0 rss: 60Mb
00:08:21.335 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:21.335 This may also happen if the target rejected all inputs we tried so far
00:08:21.335 [2024-07-23 05:07:52.226231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.335 [2024-07-23 05:07:52.226271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.335 [2024-07-23 05:07:52.226302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.335 [2024-07-23 05:07:52.226322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:21.335 [2024-07-23 05:07:52.226390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.335 [2024-07-23 05:07:52.226411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:21.595 NEW_FUNC[1/670]: 0x497740 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519
00:08:21.595 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:21.595 #4 NEW cov: 11582 ft: 11571 corp: 2/77b lim: 105 exec/s: 0 rss: 66Mb L: 76/76 MS: 2 CopyPart-InsertRepeatedBytes-
00:08:21.595 [2024-07-23 05:07:52.667090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.595 [2024-07-23 05:07:52.667134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0
sqhd:0002 p:0 m:0 dnr:1 00:08:21.595 [2024-07-23 05:07:52.667200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.595 [2024-07-23 05:07:52.667220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.855 NEW_FUNC[1/1]: 0x16bfde0 in _nvme_qpair_complete_abort_queued_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:593 00:08:21.855 #15 NEW cov: 11707 ft: 12413 corp: 3/129b lim: 105 exec/s: 0 rss: 67Mb L: 52/76 MS: 1 InsertRepeatedBytes- 00:08:21.855 [2024-07-23 05:07:52.717159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.717196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.717255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.717276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.855 #16 NEW cov: 11713 ft: 12722 corp: 4/181b lim: 105 exec/s: 0 rss: 67Mb L: 52/76 MS: 1 ChangeBit- 00:08:21.855 [2024-07-23 05:07:52.767717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.767752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.767811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2314885530818453536 len:8225 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.767830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.767892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2314885530818453536 len:8225 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.767911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.767973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2314885487868780576 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.767992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.768055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.768075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.855 #17 NEW cov: 11798 ft: 13456 corp: 5/286b lim: 105 exec/s: 0 rss: 67Mb L: 105/105 MS: 1 InsertRepeatedBytes- 00:08:21.855 [2024-07-23 
05:07:52.817583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.817617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.817661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.817681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.817743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.817765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.855 #18 NEW cov: 11798 ft: 13523 corp: 6/359b lim: 105 exec/s: 0 rss: 67Mb L: 73/105 MS: 1 CrossOver- 00:08:21.855 [2024-07-23 05:07:52.867692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.867728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.867774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.867794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.867857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.867877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.855 #19 NEW cov: 11798 ft: 13567 corp: 7/435b lim: 105 exec/s: 0 rss: 67Mb L: 76/105 MS: 1 CopyPart- 00:08:21.855 [2024-07-23 05:07:52.917738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.917772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.855 [2024-07-23 05:07:52.917821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.855 [2024-07-23 05:07:52.917839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.115 #20 NEW cov: 11798 ft: 13731 corp: 8/487b lim: 105 exec/s: 0 rss: 67Mb L: 52/105 MS: 1 ChangeBinInt- 00:08:22.115 [2024-07-23 05:07:52.968160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:52.968196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.115 [2024-07-23 05:07:52.968249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:52.968269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.115 [2024-07-23 05:07:52.968331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1591483802437688854 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:52.968351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.115 [2024-07-23 05:07:52.968413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1591483836797425174 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:52.968432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.115 #21 NEW cov: 11798 ft: 13776 corp: 9/580b lim: 105 exec/s: 0 rss: 68Mb L: 93/105 MS: 1 CrossOver- 00:08:22.115 [2024-07-23 05:07:53.027886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:53.027920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.115 #22 NEW cov: 11798 ft: 14249 corp: 10/621b lim: 105 exec/s: 0 rss: 68Mb L: 41/105 MS: 1 EraseBytes- 00:08:22.115 [2024-07-23 05:07:53.078157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:53.078192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.115 [2024-07-23 05:07:53.078231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:53.078250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.115 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:22.115 #23 NEW cov: 11821 ft: 14309 corp: 11/673b lim: 105 exec/s: 0 rss: 68Mb L: 52/105 MS: 1 ChangeBit- 00:08:22.115 [2024-07-23 05:07:53.128552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:53.128586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.115 [2024-07-23 05:07:53.128646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:53.128669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.115 [2024-07-23 05:07:53.128730] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:53.128750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.115 [2024-07-23 05:07:53.128813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:53.128833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.115 #24 NEW cov: 11821 ft: 14384 corp: 12/766b lim: 105 exec/s: 0 rss: 68Mb L: 93/105 MS: 1 CopyPart- 00:08:22.115 [2024-07-23 05:07:53.188538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:53.188571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.115 [2024-07-23 05:07:53.188611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2314885530818453536 len:8225 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.115 [2024-07-23 05:07:53.188632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.373 #25 NEW cov: 11821 ft: 14417 corp: 13/824b lim: 105 exec/s: 25 rss: 68Mb L: 58/105 MS: 1 EraseBytes- 00:08:22.373 [2024-07-23 05:07:53.248694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.373 [2024-07-23 05:07:53.248728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.373 [2024-07-23 05:07:53.248779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483793847752214 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.373 [2024-07-23 05:07:53.248799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.373 #26 NEW cov: 11821 ft: 14492 corp: 14/876b lim: 105 exec/s: 26 rss: 68Mb L: 52/105 MS: 1 ChangeBit- 00:08:22.373 [2024-07-23 05:07:53.298827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.373 [2024-07-23 05:07:53.298861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.373 [2024-07-23 05:07:53.298904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5854 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.373 [2024-07-23 05:07:53.298923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.373 #27 NEW cov: 11821 ft: 14526 corp: 15/928b lim: 105 exec/s: 27 rss: 68Mb L: 52/105 MS: 1 ChangeByte- 00:08:22.373 [2024-07-23 05:07:53.339318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.373 [2024-07-23 05:07:53.339351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.373 [2024-07-23 05:07:53.339417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2314885530818453536 len:8225 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.373 [2024-07-23 05:07:53.339437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.373 [2024-07-23 05:07:53.339510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2314885530818453536 len:8225 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.373 [2024-07-23 05:07:53.339530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.374 [2024-07-23 05:07:53.339595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2314885487868780576 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.374 [2024-07-23 05:07:53.339614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.374 [2024-07-23 05:07:53.339678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:1607661020096370198 len:58161 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.374 [2024-07-23 05:07:53.339698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:22.374 #28 NEW cov: 11821 ft: 14547 corp: 16/1033b lim: 105 exec/s: 28 rss: 68Mb L: 105/105 MS: 1 CMP- DE: "O\217.tZ\3430\000"- 00:08:22.374 [2024-07-23 05:07:53.389091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.374 [2024-07-23 05:07:53.389125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.374 [2024-07-23 05:07:53.389177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2314885530818453536 len:8225 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.374 [2024-07-23 05:07:53.389197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.374 #29 NEW cov: 11821 ft: 14584 corp: 17/1091b lim: 105 exec/s: 29 rss: 68Mb L: 58/105 MS: 1 CopyPart- 00:08:22.374 [2024-07-23 05:07:53.439226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591484047250822678 len:29787 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.374 [2024-07-23 05:07:53.439260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.374 [2024-07-23 05:07:53.439310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2314885530818453536 len:8225 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.374 [2024-07-23 05:07:53.439330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.632 #30 NEW cov: 11821 ft: 14665 corp: 18/1149b lim: 105 exec/s: 30 rss: 68Mb L: 58/105 MS: 1 
PersAutoDict- DE: "O\217.tZ\3430\000"- 00:08:22.632 [2024-07-23 05:07:53.489626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.632 [2024-07-23 05:07:53.489660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.632 [2024-07-23 05:07:53.489719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16855260462145810432 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.632 [2024-07-23 05:07:53.489739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.632 [2024-07-23 05:07:53.489799] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1591483802437686806 len:7703 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.632 [2024-07-23 05:07:53.489819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.632 [2024-07-23 05:07:53.489882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.632 [2024-07-23 05:07:53.489902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.632 #31 NEW cov: 11821 ft: 14793 corp: 19/1250b lim: 105 exec/s: 31 rss: 68Mb L: 101/105 MS: 1 PersAutoDict- DE: "O\217.tZ\3430\000"- 00:08:22.632 [2024-07-23 05:07:53.539593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.632 [2024-07-23 05:07:53.539628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.632 [2024-07-23 05:07:53.539687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.632 [2024-07-23 05:07:53.539706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.633 [2024-07-23 05:07:53.539767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.633 [2024-07-23 05:07:53.539787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.633 #32 NEW cov: 11821 ft: 14811 corp: 20/1327b lim: 105 exec/s: 32 rss: 68Mb L: 77/105 MS: 1 InsertByte- 00:08:22.633 [2024-07-23 05:07:53.589677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.633 [2024-07-23 05:07:53.589711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.633 [2024-07-23 05:07:53.589752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.633 [2024-07-23 05:07:53.589774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.633 #33 NEW cov: 11821 ft: 14824 corp: 21/1379b lim: 105 exec/s: 33 rss: 68Mb L: 52/105 MS: 1 ChangeBit- 00:08:22.633 [2024-07-23 05:07:53.629721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.633 [2024-07-23 05:07:53.629754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.633 [2024-07-23 05:07:53.629796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.633 [2024-07-23 05:07:53.629815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.633 #36 NEW cov: 11821 ft: 14841 corp: 22/1423b lim: 105 exec/s: 36 rss: 68Mb L: 44/105 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:22.633 [2024-07-23 05:07:53.670111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.633 [2024-07-23 05:07:53.670144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.633 [2024-07-23 05:07:53.670201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.633 [2024-07-23 05:07:53.670221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.633 [2024-07-23 05:07:53.670284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.633 [2024-07-23 05:07:53.670303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.633 [2024-07-23 05:07:53.670368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.633 [2024-07-23 05:07:53.670390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.633 #37 NEW cov: 11821 ft: 14857 corp: 23/1516b lim: 105 exec/s: 37 rss: 69Mb L: 93/105 MS: 1 ChangeBinInt- 00:08:22.892 [2024-07-23 05:07:53.730041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.730076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.892 [2024-07-23 05:07:53.730128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.730148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.892 #38 NEW cov: 11821 ft: 14875 corp: 24/1578b lim: 105 exec/s: 38 rss: 69Mb L: 62/105 MS: 1 CopyPart- 00:08:22.892 [2024-07-23 05:07:53.780065] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.780100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.892 #39 NEW cov: 11821 ft: 14889 corp: 25/1619b lim: 105 exec/s: 39 rss: 69Mb L: 41/105 MS: 1 CopyPart- 00:08:22.892 [2024-07-23 05:07:53.840532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.840568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.892 [2024-07-23 05:07:53.840607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.840627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.892 [2024-07-23 05:07:53.840689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.840710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.892 #40 NEW cov: 11821 ft: 14904 corp: 26/1696b lim: 105 exec/s: 40 rss: 69Mb L: 77/105 MS: 1 ShuffleBytes- 00:08:22.892 [2024-07-23 05:07:53.900605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.900641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.892 [2024-07-23 05:07:53.900678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.900699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.892 #41 NEW cov: 11821 ft: 14915 corp: 27/1748b lim: 105 exec/s: 41 rss: 69Mb L: 52/105 MS: 1 ShuffleBytes- 00:08:22.892 [2024-07-23 05:07:53.950700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.950735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.892 [2024-07-23 05:07:53.950788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:6679 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.892 [2024-07-23 05:07:53.950808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.892 #42 NEW cov: 11821 ft: 14923 corp: 28/1800b lim: 105 exec/s: 42 rss: 69Mb L: 52/105 MS: 1 ChangeBinInt- 00:08:23.151 [2024-07-23 05:07:53.990813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 
[2024-07-23 05:07:53.990849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:53.990901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:53.990921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.151 #43 NEW cov: 11821 ft: 14931 corp: 29/1852b lim: 105 exec/s: 43 rss: 69Mb L: 52/105 MS: 1 CopyPart- 00:08:23.151 [2024-07-23 05:07:54.030902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.030935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:54.030978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:6679 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.030997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.151 #44 NEW cov: 11821 ft: 14948 corp: 30/1904b lim: 105 exec/s: 44 rss: 69Mb L: 52/105 MS: 1 ChangeBinInt- 00:08:23.151 [2024-07-23 05:07:54.081201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.081236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:54.081284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.081303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:54.081365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.081384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.151 #45 NEW cov: 11821 ft: 14963 corp: 31/1980b lim: 105 exec/s: 45 rss: 69Mb L: 76/105 MS: 1 CopyPart- 00:08:23.151 [2024-07-23 05:07:54.121304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.121338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:54.121380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.121400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:54.121472] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.121491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.151 #46 NEW cov: 11821 ft: 14978 corp: 32/2056b lim: 105 exec/s: 46 rss: 69Mb L: 76/105 MS: 1 ChangeByte- 00:08:23.151 [2024-07-23 05:07:54.161595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.161633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:54.161676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4617902752030660118 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.161695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:54.161757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.161776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:54.161839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.161859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.151 #47 NEW cov: 11821 ft: 14982 corp: 33/2149b lim: 105 exec/s: 47 rss: 69Mb L: 93/105 MS: 1 ChangeByte- 00:08:23.151 [2024-07-23 05:07:54.221526] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.221561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.151 [2024-07-23 05:07:54.221600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686790 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.151 [2024-07-23 05:07:54.221619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.411 #48 NEW cov: 11821 ft: 14985 corp: 34/2201b lim: 105 exec/s: 24 rss: 70Mb L: 52/105 MS: 1 ChangeBit- 00:08:23.411 #48 DONE cov: 11821 ft: 14985 corp: 34/2201b lim: 105 exec/s: 24 rss: 70Mb 00:08:23.411 ###### Recommended dictionary. ###### 00:08:23.411 "O\217.tZ\3430\000" # Uses: 2 00:08:23.411 ###### End of recommended dictionary. 
###### 00:08:23.411 Done 48 runs in 2 second(s) 00:08:23.411 05:07:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:23.411 05:07:54 -- ../common.sh@72 -- # (( i++ )) 00:08:23.411 05:07:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.411 05:07:54 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:23.411 05:07:54 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:23.411 05:07:54 -- nvmf/run.sh@24 -- # local timen=1 00:08:23.411 05:07:54 -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.411 05:07:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:23.411 05:07:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:23.411 05:07:54 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:23.411 05:07:54 -- nvmf/run.sh@29 -- # port=4417 00:08:23.411 05:07:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:23.411 05:07:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:23.411 05:07:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.411 05:07:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:23.411 [2024-07-23 05:07:54.431405] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:23.411 [2024-07-23 05:07:54.431476] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3146741 ] 00:08:23.411 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.670 [2024-07-23 05:07:54.644566] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.670 [2024-07-23 05:07:54.719749] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:23.670 [2024-07-23 05:07:54.719923] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.930 [2024-07-23 05:07:54.781440] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.930 [2024-07-23 05:07:54.797814] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:23.930 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.930 INFO: Seed: 3954132180 00:08:23.930 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:23.930 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:23.930 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:23.930 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.930 #2 INITED exec/s: 0 rss: 60Mb 00:08:23.930 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:23.930 This may also happen if the target rejected all inputs we tried so far 00:08:23.930 [2024-07-23 05:07:54.847527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.930 [2024-07-23 05:07:54.847567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.930 [2024-07-23 05:07:54.847609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.930 [2024-07-23 05:07:54.847630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.930 [2024-07-23 05:07:54.847692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.930 [2024-07-23 05:07:54.847713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.930 [2024-07-23 05:07:54.847780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.930 [2024-07-23 05:07:54.847801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.189 NEW_FUNC[1/669]: 0x49aa30 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:24.189 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:24.189 #15 NEW cov: 11603 ft: 11604 corp: 2/105b lim: 120 exec/s: 0 rss: 67Mb L: 104/104 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:24.449 [2024-07-23 05:07:55.288546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.288590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.288623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.288644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.288705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.288729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.288789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.288808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:24.449 NEW_FUNC[1/3]: 0xf686f0 in posix_sock_flush /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1441 00:08:24.449 NEW_FUNC[2/3]: 0x16c4af0 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1090 00:08:24.449 #21 NEW cov: 11728 ft: 12207 corp: 3/207b lim: 120 exec/s: 0 rss: 67Mb L: 102/104 MS: 1 EraseBytes- 00:08:24.449 [2024-07-23 05:07:55.348615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.348650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.348695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.348715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.348778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.348798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.348857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.348878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.449 #22 NEW cov: 11734 ft: 12416 corp: 4/311b lim: 120 exec/s: 0 rss: 67Mb L: 104/104 MS: 1 ChangeBinInt- 00:08:24.449 [2024-07-23 05:07:55.388164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3959422976 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.388201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.449 #24 NEW cov: 11819 ft: 13569 corp: 5/354b lim: 120 exec/s: 0 rss: 67Mb L: 43/104 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:24.449 [2024-07-23 05:07:55.438554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3959422976 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.438589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.438633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.438653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.449 #25 NEW cov: 11819 ft: 13978 corp: 6/415b lim: 120 exec/s: 0 rss: 67Mb L: 61/104 MS: 1 InsertRepeatedBytes- 00:08:24.449 [2024-07-23 05:07:55.499071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 
[2024-07-23 05:07:55.499105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.499162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.499188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.499250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.499268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.449 [2024-07-23 05:07:55.499329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.449 [2024-07-23 05:07:55.499349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.449 #26 NEW cov: 11819 ft: 14128 corp: 7/517b lim: 120 exec/s: 0 rss: 67Mb L: 102/104 MS: 1 ChangeBit- 00:08:24.709 [2024-07-23 05:07:55.559189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.559223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.559282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.559302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.559361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.559382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.559449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:2816 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.559469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.709 #27 NEW cov: 11819 ft: 14181 corp: 8/621b lim: 120 exec/s: 0 rss: 68Mb L: 104/104 MS: 1 CopyPart- 00:08:24.709 [2024-07-23 05:07:55.619366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.619401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.619466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.619485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.619547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.619567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.619625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.619645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.709 #28 NEW cov: 11819 ft: 14214 corp: 9/723b lim: 120 exec/s: 0 rss: 68Mb L: 102/104 MS: 1 ChangeBit- 00:08:24.709 [2024-07-23 05:07:55.669506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.669545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.669589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.669609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.669672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.669694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.669755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.669777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.709 #29 NEW cov: 11819 ft: 14245 corp: 10/825b lim: 120 exec/s: 0 rss: 68Mb L: 102/104 MS: 1 ChangeBit- 00:08:24.709 [2024-07-23 05:07:55.719502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3959422976 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.719537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.719578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.719598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.719659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18374686483966590975 len:1 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.719679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.709 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:24.709 #30 NEW cov: 11842 ft: 14589 corp: 11/901b lim: 120 exec/s: 0 rss: 68Mb L: 76/104 MS: 1 InsertRepeatedBytes- 00:08:24.709 [2024-07-23 05:07:55.779838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:281473550583808 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.779872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.779918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.779939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.780002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.780023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.709 [2024-07-23 05:07:55.780084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.709 [2024-07-23 05:07:55.780104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.997 #31 NEW cov: 11842 ft: 14616 corp: 12/1005b lim: 120 exec/s: 0 rss: 68Mb L: 104/104 MS: 1 ChangeBinInt- 00:08:24.997 [2024-07-23 05:07:55.819646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:55.819681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.997 [2024-07-23 05:07:55.819730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:55.819749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.997 #34 NEW cov: 11842 ft: 14637 corp: 13/1063b lim: 120 exec/s: 34 rss: 68Mb L: 58/104 MS: 3 CrossOver-ShuffleBytes-CrossOver- 00:08:24.997 [2024-07-23 05:07:55.869606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:55.869641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.997 #38 NEW cov: 11842 ft: 14726 corp: 14/1104b lim: 120 exec/s: 38 rss: 68Mb L: 41/104 MS: 4 ShuffleBytes-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:24.997 [2024-07-23 05:07:55.909865] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:55.909900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.997 [2024-07-23 05:07:55.909941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:55.909960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.997 #39 NEW cov: 11842 ft: 14760 corp: 15/1162b lim: 120 exec/s: 39 rss: 68Mb L: 58/104 MS: 1 ChangeByte- 00:08:24.997 [2024-07-23 05:07:55.970422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:55.970462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.997 [2024-07-23 05:07:55.970515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:55.970535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.997 [2024-07-23 05:07:55.970596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:55.970616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.997 [2024-07-23 05:07:55.970678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:55.970697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.997 #40 NEW cov: 11842 ft: 14878 corp: 16/1264b lim: 120 exec/s: 40 rss: 68Mb L: 102/104 MS: 1 ShuffleBytes- 00:08:24.997 [2024-07-23 05:07:56.010179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:56.010213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.997 [2024-07-23 05:07:56.010249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446601927471923199 len:16121 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.997 [2024-07-23 05:07:56.010273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.997 #41 NEW cov: 11842 ft: 14915 corp: 17/1322b lim: 120 exec/s: 41 rss: 68Mb L: 58/104 MS: 1 CMP- DE: "\377\377~\267x\024>\370"- 00:08:25.256 [2024-07-23 05:07:56.070218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070421217279 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.256 [2024-07-23 05:07:56.070255] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.256 #43 NEW cov: 11842 ft: 14964 corp: 18/1367b lim: 120 exec/s: 43 rss: 68Mb L: 45/104 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:25.256 [2024-07-23 05:07:56.110845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.256 [2024-07-23 05:07:56.110879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.256 [2024-07-23 05:07:56.110935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.256 [2024-07-23 05:07:56.110956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.256 [2024-07-23 05:07:56.111019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.256 [2024-07-23 05:07:56.111039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.256 [2024-07-23 05:07:56.111100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.256 [2024-07-23 05:07:56.111119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.256 #44 NEW cov: 11842 ft: 15017 corp: 19/1465b lim: 120 exec/s: 44 rss: 68Mb L: 98/104 MS: 1 EraseBytes- 00:08:25.256 [2024-07-23 05:07:56.150417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073374007296 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.256 [2024-07-23 05:07:56.150456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.256 #45 NEW cov: 11842 ft: 15034 corp: 20/1508b lim: 120 exec/s: 45 rss: 68Mb L: 43/104 MS: 1 CrossOver- 00:08:25.256 [2024-07-23 05:07:56.200923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.256 [2024-07-23 05:07:56.200957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.256 [2024-07-23 05:07:56.201006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.256 [2024-07-23 05:07:56.201026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.256 [2024-07-23 05:07:56.201087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.256 [2024-07-23 05:07:56.201107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.257 #46 NEW cov: 11842 ft: 15068 corp: 21/1603b lim: 120 exec/s: 46 
rss: 69Mb L: 95/104 MS: 1 InsertRepeatedBytes- 00:08:25.257 [2024-07-23 05:07:56.260726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18374697471499567103 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.257 [2024-07-23 05:07:56.260765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.257 #47 NEW cov: 11842 ft: 15137 corp: 22/1648b lim: 120 exec/s: 47 rss: 69Mb L: 45/104 MS: 1 ChangeBinInt- 00:08:25.257 [2024-07-23 05:07:56.321405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.257 [2024-07-23 05:07:56.321440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.257 [2024-07-23 05:07:56.321504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.257 [2024-07-23 05:07:56.321524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.257 [2024-07-23 05:07:56.321584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.257 [2024-07-23 05:07:56.321604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.257 [2024-07-23 05:07:56.321667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.257 [2024-07-23 05:07:56.321686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.257 #48 NEW cov: 11842 ft: 15142 corp: 23/1750b lim: 120 exec/s: 48 rss: 69Mb L: 102/104 MS: 1 ChangeBit- 00:08:25.516 [2024-07-23 05:07:56.361166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.361199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.361245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446601927471923199 len:16121 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.361265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.516 #49 NEW cov: 11842 ft: 15145 corp: 24/1813b lim: 120 exec/s: 49 rss: 69Mb L: 63/104 MS: 1 InsertRepeatedBytes- 00:08:25.516 [2024-07-23 05:07:56.411615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.411649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.411708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.411727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.411788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.411808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.411871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:2816 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.411891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.516 #50 NEW cov: 11842 ft: 15153 corp: 25/1917b lim: 120 exec/s: 50 rss: 69Mb L: 104/104 MS: 1 CopyPart- 00:08:25.516 [2024-07-23 05:07:56.461786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283446783 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.461820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.461877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.461897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.461959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.461979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.462039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.462058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.516 #51 NEW cov: 11842 ft: 15155 corp: 26/2022b lim: 120 exec/s: 51 rss: 69Mb L: 105/105 MS: 1 InsertByte- 00:08:25.516 [2024-07-23 05:07:56.501592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073374007296 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.501625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.501664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.501685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.516 #52 NEW cov: 11842 ft: 15162 corp: 27/2084b lim: 120 exec/s: 52 rss: 69Mb L: 62/105 MS: 1 CopyPart- 
00:08:25.516 [2024-07-23 05:07:56.562139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.562172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.562233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.562253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.562312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.562330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.562392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.562412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.516 #53 NEW cov: 11842 ft: 15176 corp: 28/2198b lim: 120 exec/s: 53 rss: 69Mb L: 114/114 MS: 1 CopyPart- 00:08:25.516 [2024-07-23 05:07:56.602248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283446783 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.602281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.602324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.602342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.602402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.602423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.516 [2024-07-23 05:07:56.602490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.516 [2024-07-23 05:07:56.602510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.775 #54 NEW cov: 11842 ft: 15187 corp: 29/2315b lim: 120 exec/s: 54 rss: 69Mb L: 117/117 MS: 1 CopyPart- 00:08:25.775 [2024-07-23 05:07:56.652353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9130898799680552959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.775 [2024-07-23 05:07:56.652386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.775 [2024-07-23 05:07:56.652450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.775 [2024-07-23 05:07:56.652469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.775 [2024-07-23 05:07:56.652531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.775 [2024-07-23 05:07:56.652550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.776 [2024-07-23 05:07:56.652609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.776 [2024-07-23 05:07:56.652630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.776 #55 NEW cov: 11842 ft: 15204 corp: 30/2417b lim: 120 exec/s: 55 rss: 70Mb L: 102/117 MS: 1 PersAutoDict- DE: "\377\377~\267x\024>\370"- 00:08:25.776 [2024-07-23 05:07:56.702178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.776 [2024-07-23 05:07:56.702211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.776 [2024-07-23 05:07:56.702252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9130898801106616319 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.776 [2024-07-23 05:07:56.702272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.776 #56 NEW cov: 11842 ft: 15271 corp: 31/2488b lim: 120 exec/s: 56 rss: 70Mb L: 71/117 MS: 1 PersAutoDict- DE: "\377\377~\267x\024>\370"- 00:08:25.776 [2024-07-23 05:07:56.762325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4179340458159243264 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.776 [2024-07-23 05:07:56.762358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.776 [2024-07-23 05:07:56.762399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.776 [2024-07-23 05:07:56.762422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.776 #57 NEW cov: 11842 ft: 15304 corp: 32/2550b lim: 120 exec/s: 57 rss: 70Mb L: 62/117 MS: 1 CrossOver- 00:08:25.776 [2024-07-23 05:07:56.822722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.776 [2024-07-23 05:07:56.822754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.776 [2024-07-23 05:07:56.822797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 
cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.776 [2024-07-23 05:07:56.822817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.776 [2024-07-23 05:07:56.822878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.776 [2024-07-23 05:07:56.822897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.776 #58 NEW cov: 11842 ft: 15332 corp: 33/2645b lim: 120 exec/s: 29 rss: 70Mb L: 95/117 MS: 1 CopyPart- 00:08:25.776 #58 DONE cov: 11842 ft: 15332 corp: 33/2645b lim: 120 exec/s: 29 rss: 70Mb 00:08:25.776 ###### Recommended dictionary. ###### 00:08:25.776 "\377\377~\267x\024>\370" # Uses: 2 00:08:25.776 ###### End of recommended dictionary. ###### 00:08:25.776 Done 58 runs in 2 second(s) 00:08:26.035 05:07:56 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:26.035 05:07:56 -- ../common.sh@72 -- # (( i++ )) 00:08:26.035 05:07:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:26.035 05:07:56 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:26.035 05:07:57 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:26.035 05:07:57 -- nvmf/run.sh@24 -- # local timen=1 00:08:26.035 05:07:57 -- nvmf/run.sh@25 -- # local core=0x1 00:08:26.035 05:07:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:26.035 05:07:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:26.035 05:07:57 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:26.035 05:07:57 -- nvmf/run.sh@29 -- # port=4418 00:08:26.035 05:07:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:26.035 05:07:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:26.035 05:07:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:26.035 05:07:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:26.035 [2024-07-23 05:07:57.042827] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:26.035 [2024-07-23 05:07:57.042898] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3147068 ] 00:08:26.035 EAL: No free 2048 kB hugepages reported on node 1 00:08:26.294 [2024-07-23 05:07:57.262867] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.294 [2024-07-23 05:07:57.338899] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:26.294 [2024-07-23 05:07:57.339077] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.553 [2024-07-23 05:07:57.400432] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.553 [2024-07-23 05:07:57.416810] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:26.553 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.553 INFO: Seed: 2278164307 00:08:26.553 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:26.553 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:26.553 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:26.553 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.553 #2 INITED exec/s: 0 rss: 60Mb 00:08:26.553 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.553 This may also happen if the target rejected all inputs we tried so far 00:08:26.553 [2024-07-23 05:07:57.465986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.553 [2024-07-23 05:07:57.466022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.553 [2024-07-23 05:07:57.466071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.553 [2024-07-23 05:07:57.466090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.812 NEW_FUNC[1/670]: 0x49e290 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:26.812 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.812 #9 NEW cov: 11555 ft: 11559 corp: 2/49b lim: 100 exec/s: 0 rss: 67Mb L: 48/48 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:26.812 [2024-07-23 05:07:57.896962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.812 [2024-07-23 05:07:57.897003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.071 #10 NEW cov: 11672 ft: 12427 corp: 3/88b lim: 100 exec/s: 0 rss: 67Mb L: 39/48 MS: 1 InsertRepeatedBytes- 00:08:27.071 [2024-07-23 05:07:57.947105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.071 [2024-07-23 05:07:57.947139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.071 [2024-07-23 05:07:57.947182] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.071 [2024-07-23 05:07:57.947200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.071 #11 NEW cov: 11678 ft: 12698 corp: 4/137b lim: 100 exec/s: 0 rss: 67Mb L: 49/49 MS: 1 InsertByte- 00:08:27.071 [2024-07-23 05:07:57.997216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.071 [2024-07-23 05:07:57.997249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.071 [2024-07-23 05:07:57.997297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.071 [2024-07-23 05:07:57.997316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.071 #12 NEW cov: 11763 ft: 12902 corp: 5/177b lim: 100 exec/s: 0 rss: 67Mb L: 40/49 MS: 1 InsertByte- 00:08:27.071 [2024-07-23 05:07:58.047232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.071 [2024-07-23 05:07:58.047264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.071 #18 NEW cov: 11763 ft: 13043 corp: 6/198b lim: 100 exec/s: 0 rss: 67Mb L: 21/49 MS: 1 InsertRepeatedBytes- 00:08:27.071 [2024-07-23 05:07:58.087772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.071 [2024-07-23 05:07:58.087805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.071 [2024-07-23 05:07:58.087851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.071 [2024-07-23 05:07:58.087873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.071 [2024-07-23 05:07:58.087931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:27.071 [2024-07-23 05:07:58.087949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.071 [2024-07-23 05:07:58.088009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:27.071 [2024-07-23 05:07:58.088027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.071 #20 NEW cov: 11763 ft: 13437 corp: 7/296b lim: 100 exec/s: 0 rss: 67Mb L: 98/98 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:27.071 [2024-07-23 05:07:58.137867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.071 [2024-07-23 05:07:58.137900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.071 [2024-07-23 05:07:58.137941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.071 [2024-07-23 05:07:58.137959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:27.071 [2024-07-23 05:07:58.138017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:27.071 [2024-07-23 05:07:58.138035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.071 [2024-07-23 05:07:58.138094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:27.071 [2024-07-23 05:07:58.138112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.330 #21 NEW cov: 11763 ft: 13499 corp: 8/394b lim: 100 exec/s: 0 rss: 67Mb L: 98/98 MS: 1 ShuffleBytes- 00:08:27.330 [2024-07-23 05:07:58.197804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.331 [2024-07-23 05:07:58.197836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.331 [2024-07-23 05:07:58.197878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.331 [2024-07-23 05:07:58.197897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.331 #22 NEW cov: 11763 ft: 13533 corp: 9/443b lim: 100 exec/s: 0 rss: 68Mb L: 49/98 MS: 1 ChangeBit- 00:08:27.331 [2024-07-23 05:07:58.247926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.331 [2024-07-23 05:07:58.247958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.331 [2024-07-23 05:07:58.248004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.331 [2024-07-23 05:07:58.248023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.331 #23 NEW cov: 11763 ft: 13555 corp: 10/483b lim: 100 exec/s: 0 rss: 68Mb L: 40/98 MS: 1 ChangeBinInt- 00:08:27.331 [2024-07-23 05:07:58.298082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.331 [2024-07-23 05:07:58.298114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.331 [2024-07-23 05:07:58.298181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.331 [2024-07-23 05:07:58.298201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.331 #24 NEW cov: 11763 ft: 13598 corp: 11/532b lim: 100 exec/s: 0 rss: 68Mb L: 49/98 MS: 1 ShuffleBytes- 00:08:27.331 [2024-07-23 05:07:58.348125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.331 [2024-07-23 05:07:58.348156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.331 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:27.331 #25 NEW cov: 11786 ft: 13633 corp: 12/571b lim: 100 exec/s: 0 rss: 68Mb L: 39/98 MS: 1 ChangeBinInt- 00:08:27.331 [2024-07-23 
05:07:58.398453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.331 [2024-07-23 05:07:58.398487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.331 [2024-07-23 05:07:58.398528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.331 [2024-07-23 05:07:58.398547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.590 #26 NEW cov: 11786 ft: 13687 corp: 13/620b lim: 100 exec/s: 0 rss: 68Mb L: 49/98 MS: 1 ChangeBit- 00:08:27.590 [2024-07-23 05:07:58.438534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.590 [2024-07-23 05:07:58.438566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.590 [2024-07-23 05:07:58.438608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.590 [2024-07-23 05:07:58.438627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.590 #27 NEW cov: 11786 ft: 13706 corp: 14/673b lim: 100 exec/s: 27 rss: 68Mb L: 53/98 MS: 1 CrossOver- 00:08:27.590 [2024-07-23 05:07:58.488556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.590 [2024-07-23 05:07:58.488589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.590 #28 NEW cov: 11786 ft: 13785 corp: 15/712b lim: 100 exec/s: 28 rss: 68Mb L: 39/98 MS: 1 ChangeBinInt- 00:08:27.590 [2024-07-23 05:07:58.538787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.590 [2024-07-23 05:07:58.538822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.590 [2024-07-23 05:07:58.538890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.590 [2024-07-23 05:07:58.538909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.590 #29 NEW cov: 11786 ft: 13807 corp: 16/760b lim: 100 exec/s: 29 rss: 68Mb L: 48/98 MS: 1 ShuffleBytes- 00:08:27.590 [2024-07-23 05:07:58.579172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.590 [2024-07-23 05:07:58.579204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.590 [2024-07-23 05:07:58.579262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.590 [2024-07-23 05:07:58.579278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.590 [2024-07-23 05:07:58.579337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:27.590 [2024-07-23 05:07:58.579355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:08:27.590 [2024-07-23 05:07:58.579414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:27.590 [2024-07-23 05:07:58.579432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.590 #30 NEW cov: 11786 ft: 13823 corp: 17/858b lim: 100 exec/s: 30 rss: 68Mb L: 98/98 MS: 1 ChangeBit- 00:08:27.590 [2024-07-23 05:07:58.629340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.590 [2024-07-23 05:07:58.629373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.590 [2024-07-23 05:07:58.629426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.590 [2024-07-23 05:07:58.629451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.590 [2024-07-23 05:07:58.629510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:27.590 [2024-07-23 05:07:58.629528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.590 [2024-07-23 05:07:58.629588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:27.590 [2024-07-23 05:07:58.629606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.590 #31 NEW cov: 11786 ft: 13875 corp: 18/956b lim: 100 exec/s: 31 rss: 69Mb L: 98/98 MS: 1 ChangeByte- 00:08:27.849 [2024-07-23 05:07:58.689257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.849 [2024-07-23 05:07:58.689290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.849 [2024-07-23 05:07:58.689332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.849 [2024-07-23 05:07:58.689351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.849 #32 NEW cov: 11786 ft: 13909 corp: 19/996b lim: 100 exec/s: 32 rss: 69Mb L: 40/98 MS: 1 ChangeBit- 00:08:27.850 [2024-07-23 05:07:58.729570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.850 [2024-07-23 05:07:58.729603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.850 [2024-07-23 05:07:58.729644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.850 [2024-07-23 05:07:58.729662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.850 [2024-07-23 05:07:58.729723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:27.850 [2024-07-23 05:07:58.729741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.850 [2024-07-23 05:07:58.729800] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:27.850 [2024-07-23 05:07:58.729817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.850 #33 NEW cov: 11786 ft: 13922 corp: 20/1095b lim: 100 exec/s: 33 rss: 69Mb L: 99/99 MS: 1 InsertByte- 00:08:27.850 [2024-07-23 05:07:58.779513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.850 [2024-07-23 05:07:58.779545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.850 [2024-07-23 05:07:58.779585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.850 [2024-07-23 05:07:58.779604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.850 #34 NEW cov: 11786 ft: 13944 corp: 21/1148b lim: 100 exec/s: 34 rss: 69Mb L: 53/99 MS: 1 ChangeBinInt- 00:08:27.850 [2024-07-23 05:07:58.829538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.850 [2024-07-23 05:07:58.829570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.850 #35 NEW cov: 11786 ft: 13959 corp: 22/1187b lim: 100 exec/s: 35 rss: 69Mb L: 39/99 MS: 1 ChangeBit- 00:08:27.850 [2024-07-23 05:07:58.869748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.850 [2024-07-23 05:07:58.869781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.850 [2024-07-23 05:07:58.869823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.850 [2024-07-23 05:07:58.869841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.850 #36 NEW cov: 11786 ft: 13970 corp: 23/1240b lim: 100 exec/s: 36 rss: 69Mb L: 53/99 MS: 1 CrossOver- 00:08:27.850 [2024-07-23 05:07:58.909927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:27.850 [2024-07-23 05:07:58.909958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.850 [2024-07-23 05:07:58.909996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:27.850 [2024-07-23 05:07:58.910014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.850 #37 NEW cov: 11786 ft: 13983 corp: 24/1289b lim: 100 exec/s: 37 rss: 69Mb L: 49/99 MS: 1 ShuffleBytes- 00:08:28.109 [2024-07-23 05:07:58.949982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.109 [2024-07-23 05:07:58.950014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.109 [2024-07-23 05:07:58.950054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.109 [2024-07-23 05:07:58.950072] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.109 #38 NEW cov: 11786 ft: 14015 corp: 25/1331b lim: 100 exec/s: 38 rss: 69Mb L: 42/99 MS: 1 EraseBytes- 00:08:28.109 [2024-07-23 05:07:59.000136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.109 [2024-07-23 05:07:59.000168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.109 [2024-07-23 05:07:59.000216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.109 [2024-07-23 05:07:59.000235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.109 #39 NEW cov: 11786 ft: 14017 corp: 26/1384b lim: 100 exec/s: 39 rss: 69Mb L: 53/99 MS: 1 ChangeBinInt- 00:08:28.109 [2024-07-23 05:07:59.040609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.109 [2024-07-23 05:07:59.040641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.109 [2024-07-23 05:07:59.040705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.109 [2024-07-23 05:07:59.040723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.109 [2024-07-23 05:07:59.040782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:28.109 [2024-07-23 05:07:59.040801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.109 [2024-07-23 05:07:59.040861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:28.109 [2024-07-23 05:07:59.040883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.109 [2024-07-23 05:07:59.040939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:28.109 [2024-07-23 05:07:59.040957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.109 #40 NEW cov: 11786 ft: 14056 corp: 27/1484b lim: 100 exec/s: 40 rss: 69Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:28.109 [2024-07-23 05:07:59.100355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.109 [2024-07-23 05:07:59.100387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.109 #41 NEW cov: 11786 ft: 14097 corp: 28/1504b lim: 100 exec/s: 41 rss: 69Mb L: 20/100 MS: 1 EraseBytes- 00:08:28.109 [2024-07-23 05:07:59.150564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.109 [2024-07-23 05:07:59.150595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.109 [2024-07-23 05:07:59.150643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 
00:08:28.109 [2024-07-23 05:07:59.150661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.109 #42 NEW cov: 11786 ft: 14176 corp: 29/1553b lim: 100 exec/s: 42 rss: 69Mb L: 49/100 MS: 1 ChangeByte- 00:08:28.368 [2024-07-23 05:07:59.200727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.368 [2024-07-23 05:07:59.200758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.200804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.368 [2024-07-23 05:07:59.200823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.240798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.368 [2024-07-23 05:07:59.240829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.240875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.368 [2024-07-23 05:07:59.240895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.368 #44 NEW cov: 11786 ft: 14216 corp: 30/1602b lim: 100 exec/s: 44 rss: 69Mb L: 49/100 MS: 2 CMP-ChangeBit- DE: "\000\000\000\000\002 \217S"- 00:08:28.368 [2024-07-23 05:07:59.280972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.368 [2024-07-23 05:07:59.281005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.281048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.368 [2024-07-23 05:07:59.281067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.368 #45 NEW cov: 11786 ft: 14238 corp: 31/1642b lim: 100 exec/s: 45 rss: 69Mb L: 40/100 MS: 1 ChangeBinInt- 00:08:28.368 [2024-07-23 05:07:59.320894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.368 [2024-07-23 05:07:59.320925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.368 #46 NEW cov: 11786 ft: 14249 corp: 32/1662b lim: 100 exec/s: 46 rss: 70Mb L: 20/100 MS: 1 ShuffleBytes- 00:08:28.368 [2024-07-23 05:07:59.371476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.368 [2024-07-23 05:07:59.371513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.371556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.368 [2024-07-23 05:07:59.371574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.371634] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:28.368 [2024-07-23 05:07:59.371651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.371712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:28.368 [2024-07-23 05:07:59.371729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.368 #47 NEW cov: 11786 ft: 14256 corp: 33/1760b lim: 100 exec/s: 47 rss: 70Mb L: 98/100 MS: 1 ChangeByte- 00:08:28.368 [2024-07-23 05:07:59.411612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:28.368 [2024-07-23 05:07:59.411643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.411699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:28.368 [2024-07-23 05:07:59.411716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.411773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:28.368 [2024-07-23 05:07:59.411791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.368 [2024-07-23 05:07:59.411849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:28.368 [2024-07-23 05:07:59.411867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.368 #48 NEW cov: 11786 ft: 14266 corp: 34/1858b lim: 100 exec/s: 24 rss: 70Mb L: 98/100 MS: 1 ChangeBinInt- 00:08:28.368 #48 DONE cov: 11786 ft: 14266 corp: 34/1858b lim: 100 exec/s: 24 rss: 70Mb 00:08:28.368 ###### Recommended dictionary. ###### 00:08:28.368 "\000\000\000\000\002 \217S" # Uses: 0 00:08:28.368 ###### End of recommended dictionary. 
###### 00:08:28.368 Done 48 runs in 2 second(s) 00:08:28.627 05:07:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:28.627 05:07:59 -- ../common.sh@72 -- # (( i++ )) 00:08:28.627 05:07:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.627 05:07:59 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:28.627 05:07:59 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:28.627 05:07:59 -- nvmf/run.sh@24 -- # local timen=1 00:08:28.627 05:07:59 -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.627 05:07:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:28.627 05:07:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:28.627 05:07:59 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:28.627 05:07:59 -- nvmf/run.sh@29 -- # port=4419 00:08:28.627 05:07:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:28.627 05:07:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:28.627 05:07:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.627 05:07:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:28.627 [2024-07-23 05:07:59.631238] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:28.627 [2024-07-23 05:07:59.631311] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3147575 ] 00:08:28.628 EAL: No free 2048 kB hugepages reported on node 1 00:08:28.886 [2024-07-23 05:07:59.849707] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.886 [2024-07-23 05:07:59.925238] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:28.887 [2024-07-23 05:07:59.925413] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.145 [2024-07-23 05:07:59.986387] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:29.145 [2024-07-23 05:08:00.002765] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:29.145 INFO: Running with entropic power schedule (0xFF, 100). 00:08:29.145 INFO: Seed: 571202110 00:08:29.145 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:29.145 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:29.145 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:29.145 INFO: A corpus is not provided, starting from an empty corpus 00:08:29.145 #2 INITED exec/s: 0 rss: 60Mb 00:08:29.145 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:29.145 This may also happen if the target rejected all inputs we tried so far 00:08:29.145 [2024-07-23 05:08:00.078778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:29.145 [2024-07-23 05:08:00.078822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.145 [2024-07-23 05:08:00.078859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:29.145 [2024-07-23 05:08:00.078886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.404 NEW_FUNC[1/669]: 0x4a1250 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:29.404 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:29.404 #12 NEW cov: 11532 ft: 11534 corp: 2/27b lim: 50 exec/s: 0 rss: 67Mb L: 26/26 MS: 5 InsertByte-ChangeBit-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:29.662 [2024-07-23 05:08:00.520133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:29.662 [2024-07-23 05:08:00.520183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.662 [2024-07-23 05:08:00.520312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446742974197924092 len:1 00:08:29.662 [2024-07-23 05:08:00.520343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.662 NEW_FUNC[1/1]: 0x179c420 in nvme_tcp_read_data /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/nvme_tcp.h:412 00:08:29.662 #18 NEW cov: 11650 ft: 12133 corp: 3/53b lim: 50 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 ChangeBinInt- 00:08:29.662 [2024-07-23 05:08:00.580031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7016996765293437281 len:24930 00:08:29.662 [2024-07-23 05:08:00.580064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.662 [2024-07-23 05:08:00.580109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 00:08:29.663 [2024-07-23 05:08:00.580131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.663 #21 NEW cov: 11656 ft: 12462 corp: 4/75b lim: 50 exec/s: 0 rss: 67Mb L: 22/26 MS: 3 CopyPart-CrossOver-InsertRepeatedBytes- 00:08:29.663 [2024-07-23 05:08:00.619633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7016996765293437281 len:24930 00:08:29.663 [2024-07-23 05:08:00.619665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.663 #22 NEW cov: 11741 ft: 13071 corp: 5/90b lim: 50 exec/s: 0 rss: 67Mb L: 15/26 MS: 1 EraseBytes- 00:08:29.663 [2024-07-23 05:08:00.670101] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:29.663 [2024-07-23 05:08:00.670132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.663 #33 NEW cov: 11741 ft: 13232 corp: 6/107b lim: 50 exec/s: 0 rss: 67Mb L: 17/26 MS: 1 EraseBytes- 00:08:29.663 [2024-07-23 05:08:00.720695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:29.663 [2024-07-23 05:08:00.720728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.663 [2024-07-23 05:08:00.720800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4991471924669662533 len:17734 00:08:29.663 [2024-07-23 05:08:00.720822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.663 [2024-07-23 05:08:00.720936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1162167552 len:1 00:08:29.663 [2024-07-23 05:08:00.720961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.663 #34 NEW cov: 11741 ft: 13590 corp: 7/145b lim: 50 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:29.922 [2024-07-23 05:08:00.770868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:29.922 [2024-07-23 05:08:00.770898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.922 [2024-07-23 05:08:00.770960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:29.922 [2024-07-23 05:08:00.770978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.922 [2024-07-23 05:08:00.771091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:29.922 [2024-07-23 05:08:00.771114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.922 #39 NEW cov: 11741 ft: 13714 corp: 8/183b lim: 50 exec/s: 0 rss: 67Mb L: 38/38 MS: 5 ChangeBit-ChangeBit-CopyPart-CMP-InsertRepeatedBytes- DE: "\037\000\000\000"- 00:08:29.922 [2024-07-23 05:08:00.821228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:29.922 [2024-07-23 05:08:00.821258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.922 [2024-07-23 05:08:00.821322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:29.922 [2024-07-23 05:08:00.821340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.922 [2024-07-23 05:08:00.821453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:29.922 [2024-07-23 05:08:00.821481] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.922 [2024-07-23 05:08:00.821604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2233785415175768329 len:23645 00:08:29.922 [2024-07-23 05:08:00.821630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.922 #40 NEW cov: 11741 ft: 14022 corp: 9/226b lim: 50 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:08:29.922 [2024-07-23 05:08:00.881219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:720575940899373056 len:1 00:08:29.922 [2024-07-23 05:08:00.881251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.922 [2024-07-23 05:08:00.881308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1082331758592 len:65291 00:08:29.922 [2024-07-23 05:08:00.881334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.922 [2024-07-23 05:08:00.881446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4294901760 len:1 00:08:29.923 [2024-07-23 05:08:00.881469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.923 #42 NEW cov: 11741 ft: 14125 corp: 10/257b lim: 50 exec/s: 0 rss: 67Mb L: 31/43 MS: 2 PersAutoDict-CrossOver- DE: "\037\000\000\000"- 00:08:29.923 [2024-07-23 05:08:00.931145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:29.923 [2024-07-23 05:08:00.931181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.923 [2024-07-23 05:08:00.931234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:133143986176 len:1 00:08:29.923 [2024-07-23 05:08:00.931258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.923 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.923 #43 NEW cov: 11764 ft: 14279 corp: 11/278b lim: 50 exec/s: 0 rss: 67Mb L: 21/43 MS: 1 EraseBytes- 00:08:29.923 [2024-07-23 05:08:00.971237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:16385 00:08:29.923 [2024-07-23 05:08:00.971269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.923 [2024-07-23 05:08:00.971325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:133143986176 len:1 00:08:29.923 [2024-07-23 05:08:00.971348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.923 #49 NEW cov: 11764 ft: 14311 corp: 12/299b lim: 50 exec/s: 0 rss: 68Mb L: 21/43 MS: 1 ChangeBit- 00:08:30.182 [2024-07-23 05:08:01.021644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:520101632 len:2561 00:08:30.182 [2024-07-23 05:08:01.021676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.182 [2024-07-23 05:08:01.021759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:30.182 [2024-07-23 05:08:01.021784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.182 [2024-07-23 05:08:01.021898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446462598749421322 len:1 00:08:30.182 [2024-07-23 05:08:01.021917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.182 #50 NEW cov: 11764 ft: 14366 corp: 13/334b lim: 50 exec/s: 50 rss: 68Mb L: 35/43 MS: 1 PersAutoDict- DE: "\037\000\000\000"- 00:08:30.182 [2024-07-23 05:08:01.071625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7016996765293437281 len:24930 00:08:30.182 [2024-07-23 05:08:01.071663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.182 [2024-07-23 05:08:01.071698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7016996765295075681 len:24930 00:08:30.182 [2024-07-23 05:08:01.071727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.182 #51 NEW cov: 11764 ft: 14407 corp: 14/356b lim: 50 exec/s: 51 rss: 68Mb L: 22/43 MS: 1 ChangeByte- 00:08:30.182 [2024-07-23 05:08:01.111921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.182 [2024-07-23 05:08:01.111954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.182 [2024-07-23 05:08:01.112021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:19497937210262853 len:17734 00:08:30.182 [2024-07-23 05:08:01.112042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.182 [2024-07-23 05:08:01.112158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1162167552 len:1 00:08:30.182 [2024-07-23 05:08:01.112182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.182 #52 NEW cov: 11764 ft: 14436 corp: 15/394b lim: 50 exec/s: 52 rss: 68Mb L: 38/43 MS: 1 ShuffleBytes- 00:08:30.182 [2024-07-23 05:08:01.151840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2048 len:16385 00:08:30.182 [2024-07-23 05:08:01.151874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.182 [2024-07-23 05:08:01.151928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:133143986176 len:1 00:08:30.182 [2024-07-23 05:08:01.151953] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.182 #53 NEW cov: 11764 ft: 14509 corp: 16/415b lim: 50 exec/s: 53 rss: 68Mb L: 21/43 MS: 1 ChangeBinInt- 00:08:30.182 [2024-07-23 05:08:01.202006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.182 [2024-07-23 05:08:01.202038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.182 [2024-07-23 05:08:01.202096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:30.182 [2024-07-23 05:08:01.202113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.183 #54 NEW cov: 11764 ft: 14531 corp: 17/436b lim: 50 exec/s: 54 rss: 68Mb L: 21/43 MS: 1 EraseBytes- 00:08:30.183 [2024-07-23 05:08:01.251954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.183 [2024-07-23 05:08:01.251986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.442 #55 NEW cov: 11764 ft: 14554 corp: 18/453b lim: 50 exec/s: 55 rss: 68Mb L: 17/43 MS: 1 PersAutoDict- DE: "\037\000\000\000"- 00:08:30.442 [2024-07-23 05:08:01.292187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7016996765293437227 len:24930 00:08:30.442 [2024-07-23 05:08:01.292218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.442 [2024-07-23 05:08:01.292258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7016996765293443681 len:24930 00:08:30.442 [2024-07-23 05:08:01.292292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.442 #56 NEW cov: 11764 ft: 14565 corp: 19/476b lim: 50 exec/s: 56 rss: 68Mb L: 23/43 MS: 1 InsertByte- 00:08:30.442 [2024-07-23 05:08:01.342576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:520101632 len:2561 00:08:30.442 [2024-07-23 05:08:01.342608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.442 [2024-07-23 05:08:01.342684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:133143986176 len:1 00:08:30.442 [2024-07-23 05:08:01.342705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.442 [2024-07-23 05:08:01.342818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:71212112545841152 len:65536 00:08:30.442 [2024-07-23 05:08:01.342840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.442 #57 NEW cov: 11764 ft: 14602 corp: 20/515b lim: 50 exec/s: 57 rss: 68Mb L: 39/43 MS: 1 PersAutoDict- DE: "\037\000\000\000"- 00:08:30.442 [2024-07-23 05:08:01.382906] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:30.442 [2024-07-23 05:08:01.382940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.442 [2024-07-23 05:08:01.383018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:30.442 [2024-07-23 05:08:01.383041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.442 [2024-07-23 05:08:01.383163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:30.442 [2024-07-23 05:08:01.383187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.442 [2024-07-23 05:08:01.383298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:10 00:08:30.442 [2024-07-23 05:08:01.383320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.442 #58 NEW cov: 11764 ft: 14615 corp: 21/560b lim: 50 exec/s: 58 rss: 68Mb L: 45/45 MS: 1 CopyPart- 00:08:30.442 [2024-07-23 05:08:01.422655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.442 [2024-07-23 05:08:01.422686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.442 [2024-07-23 05:08:01.422744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:30.442 [2024-07-23 05:08:01.422768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.442 #64 NEW cov: 11764 ft: 14624 corp: 22/581b lim: 50 exec/s: 64 rss: 68Mb L: 21/45 MS: 1 CrossOver- 00:08:30.442 [2024-07-23 05:08:01.462593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.442 [2024-07-23 05:08:01.462623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.442 #65 NEW cov: 11764 ft: 14671 corp: 23/598b lim: 50 exec/s: 65 rss: 69Mb L: 17/45 MS: 1 ChangeBit- 00:08:30.442 [2024-07-23 05:08:01.503104] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:71212114173231104 len:65536 00:08:30.442 [2024-07-23 05:08:01.503139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.442 [2024-07-23 05:08:01.503197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:107069239721984 len:24930 00:08:30.442 [2024-07-23 05:08:01.503219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.442 [2024-07-23 05:08:01.503334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 00:08:30.442 [2024-07-23 05:08:01.503361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.442 #66 NEW cov: 11764 ft: 14681 corp: 24/635b lim: 50 exec/s: 66 rss: 69Mb L: 37/45 MS: 1 CrossOver- 00:08:30.702 [2024-07-23 05:08:01.543017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7016996765293437281 len:24930 00:08:30.702 [2024-07-23 05:08:01.543050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.543130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 00:08:30.702 [2024-07-23 05:08:01.543149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.543256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15408456812550215125 len:54742 00:08:30.702 [2024-07-23 05:08:01.543276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.543374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15408456814510331349 len:54742 00:08:30.702 [2024-07-23 05:08:01.543395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.702 #67 NEW cov: 11764 ft: 14695 corp: 25/680b lim: 50 exec/s: 67 rss: 69Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:30.702 [2024-07-23 05:08:01.593593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:30.702 [2024-07-23 05:08:01.593626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.593703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:30.702 [2024-07-23 05:08:01.593725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.593845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:30.702 [2024-07-23 05:08:01.593867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.593985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2304 len:1 00:08:30.702 [2024-07-23 05:08:01.594004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.702 #68 NEW cov: 11764 ft: 14818 corp: 26/726b lim: 50 exec/s: 68 rss: 69Mb L: 46/46 MS: 1 InsertRepeatedBytes- 00:08:30.702 [2024-07-23 05:08:01.643786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:30.702 [2024-07-23 05:08:01.643821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.643902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2415919104 len:1 00:08:30.702 [2024-07-23 05:08:01.643928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.644043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:30.702 [2024-07-23 05:08:01.644068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.644179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:10 00:08:30.702 [2024-07-23 05:08:01.644203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.702 #69 NEW cov: 11764 ft: 14831 corp: 27/771b lim: 50 exec/s: 69 rss: 69Mb L: 45/46 MS: 1 ChangeByte- 00:08:30.702 [2024-07-23 05:08:01.703419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.702 [2024-07-23 05:08:01.703450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.702 #70 NEW cov: 11764 ft: 14864 corp: 28/788b lim: 50 exec/s: 70 rss: 69Mb L: 17/46 MS: 1 PersAutoDict- DE: "\037\000\000\000"- 00:08:30.702 [2024-07-23 05:08:01.753729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.702 [2024-07-23 05:08:01.753764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.702 [2024-07-23 05:08:01.753809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:167772160 len:1 00:08:30.702 [2024-07-23 05:08:01.753833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.702 #71 NEW cov: 11764 ft: 14902 corp: 29/810b lim: 50 exec/s: 71 rss: 69Mb L: 22/46 MS: 1 CrossOver- 00:08:30.961 [2024-07-23 05:08:01.803945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7016996765293437227 len:24930 00:08:30.961 [2024-07-23 05:08:01.803978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.961 [2024-07-23 05:08:01.804015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7016996765293443681 len:24930 00:08:30.961 [2024-07-23 05:08:01.804038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.961 #72 NEW cov: 11764 ft: 14945 corp: 30/833b lim: 50 exec/s: 72 rss: 69Mb L: 23/46 MS: 1 ShuffleBytes- 00:08:30.961 [2024-07-23 05:08:01.864493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:30.961 [2024-07-23 05:08:01.864526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.961 [2024-07-23 05:08:01.864626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 
00:08:30.961 [2024-07-23 05:08:01.864641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.961 [2024-07-23 05:08:01.864758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:30.961 [2024-07-23 05:08:01.864784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.961 [2024-07-23 05:08:01.864898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2304 len:1 00:08:30.961 [2024-07-23 05:08:01.864923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.962 #73 NEW cov: 11764 ft: 14965 corp: 31/879b lim: 50 exec/s: 73 rss: 69Mb L: 46/46 MS: 1 ChangeBinInt- 00:08:30.962 [2024-07-23 05:08:01.924087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.962 [2024-07-23 05:08:01.924120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.962 #74 NEW cov: 11764 ft: 15015 corp: 32/895b lim: 50 exec/s: 74 rss: 69Mb L: 16/46 MS: 1 EraseBytes- 00:08:30.962 [2024-07-23 05:08:01.984630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.962 [2024-07-23 05:08:01.984663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.962 [2024-07-23 05:08:01.984733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4991471924669662533 len:17734 00:08:30.962 [2024-07-23 05:08:01.984752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.962 [2024-07-23 05:08:01.984875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:17593348211968 len:1 00:08:30.962 [2024-07-23 05:08:01.984901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.962 #75 NEW cov: 11764 ft: 15026 corp: 33/933b lim: 50 exec/s: 75 rss: 69Mb L: 38/46 MS: 1 CMP- DE: "\020\000\000\000\000\000\000\000"- 00:08:30.962 [2024-07-23 05:08:02.034711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:30.962 [2024-07-23 05:08:02.034748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.962 [2024-07-23 05:08:02.034820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2031616 len:1 00:08:30.962 [2024-07-23 05:08:02.034842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.221 #76 NEW cov: 11764 ft: 15048 corp: 34/956b lim: 50 exec/s: 38 rss: 69Mb L: 23/46 MS: 1 InsertRepeatedBytes- 00:08:31.221 #76 DONE cov: 11764 ft: 15048 corp: 34/956b lim: 50 exec/s: 38 rss: 69Mb 00:08:31.221 ###### Recommended dictionary. 
###### 00:08:31.221 "\037\000\000\000" # Uses: 5 00:08:31.221 "\020\000\000\000\000\000\000\000" # Uses: 0 00:08:31.221 ###### End of recommended dictionary. ###### 00:08:31.221 Done 76 runs in 2 second(s) 00:08:31.221 05:08:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:31.221 05:08:02 -- ../common.sh@72 -- # (( i++ )) 00:08:31.221 05:08:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:31.221 05:08:02 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:31.221 05:08:02 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:31.221 05:08:02 -- nvmf/run.sh@24 -- # local timen=1 00:08:31.221 05:08:02 -- nvmf/run.sh@25 -- # local core=0x1 00:08:31.221 05:08:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:31.221 05:08:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:31.221 05:08:02 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:31.221 05:08:02 -- nvmf/run.sh@29 -- # port=4420 00:08:31.221 05:08:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:31.221 05:08:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:31.221 05:08:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:31.221 05:08:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:31.221 [2024-07-23 05:08:02.240118] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:31.221 [2024-07-23 05:08:02.240190] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3148119 ] 00:08:31.221 EAL: No free 2048 kB hugepages reported on node 1 00:08:31.479 [2024-07-23 05:08:02.453970] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.479 [2024-07-23 05:08:02.530299] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:31.480 [2024-07-23 05:08:02.530476] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.739 [2024-07-23 05:08:02.591567] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:31.739 [2024-07-23 05:08:02.607883] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:31.739 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.739 INFO: Seed: 3176192317 00:08:31.739 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:31.739 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:31.739 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:31.739 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.739 #2 INITED exec/s: 0 rss: 60Mb 00:08:31.739 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:08:31.739 This may also happen if the target rejected all inputs we tried so far 00:08:31.739 [2024-07-23 05:08:02.673751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.739 [2024-07-23 05:08:02.673788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.739 [2024-07-23 05:08:02.673845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.739 [2024-07-23 05:08:02.673866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.739 [2024-07-23 05:08:02.673928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.739 [2024-07-23 05:08:02.673947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.739 [2024-07-23 05:08:02.674009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:31.739 [2024-07-23 05:08:02.674028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.998 NEW_FUNC[1/672]: 0x4a2e10 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:31.998 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:31.998 #18 NEW cov: 11595 ft: 11593 corp: 2/90b lim: 90 exec/s: 0 rss: 67Mb L: 89/89 MS: 1 InsertRepeatedBytes- 00:08:32.258 [2024-07-23 05:08:03.104346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.258 [2024-07-23 05:08:03.104406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.258 #19 NEW cov: 11708 ft: 13016 corp: 3/117b lim: 90 exec/s: 0 rss: 67Mb L: 27/89 MS: 1 CrossOver- 00:08:32.258 [2024-07-23 05:08:03.154791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.258 [2024-07-23 05:08:03.154828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.258 [2024-07-23 05:08:03.154866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.258 [2024-07-23 05:08:03.154897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.258 [2024-07-23 05:08:03.154958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.258 [2024-07-23 05:08:03.154982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.258 [2024-07-23 05:08:03.155046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:32.258 [2024-07-23 05:08:03.155067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:32.258 #20 NEW cov: 11714 ft: 13297 corp: 4/206b lim: 90 exec/s: 0 rss: 67Mb L: 89/89 MS: 1 ChangeByte- 00:08:32.258 [2024-07-23 05:08:03.204617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.258 [2024-07-23 05:08:03.204653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.258 [2024-07-23 05:08:03.204708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.258 [2024-07-23 05:08:03.204728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.258 #24 NEW cov: 11799 ft: 13932 corp: 5/242b lim: 90 exec/s: 0 rss: 67Mb L: 36/89 MS: 4 CopyPart-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:32.258 [2024-07-23 05:08:03.254624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.258 [2024-07-23 05:08:03.254658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.258 #26 NEW cov: 11799 ft: 13998 corp: 6/262b lim: 90 exec/s: 0 rss: 67Mb L: 20/89 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:32.258 [2024-07-23 05:08:03.294750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.258 [2024-07-23 05:08:03.294784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.258 #27 NEW cov: 11799 ft: 14134 corp: 7/290b lim: 90 exec/s: 0 rss: 67Mb L: 28/89 MS: 1 InsertByte- 00:08:32.517 [2024-07-23 05:08:03.355040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.517 [2024-07-23 05:08:03.355077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.517 [2024-07-23 05:08:03.355132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.517 [2024-07-23 05:08:03.355154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.517 #28 NEW cov: 11799 ft: 14273 corp: 8/326b lim: 90 exec/s: 0 rss: 67Mb L: 36/89 MS: 1 ChangeBit- 00:08:32.517 [2024-07-23 05:08:03.415109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.517 [2024-07-23 05:08:03.415145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.517 #33 NEW cov: 11799 ft: 14293 corp: 9/346b lim: 90 exec/s: 0 rss: 67Mb L: 20/89 MS: 5 ShuffleBytes-CMP-ChangeBit-EraseBytes-InsertRepeatedBytes- DE: "\377\377\377\377\377\377\377\377"- 00:08:32.517 [2024-07-23 05:08:03.455700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.517 [2024-07-23 05:08:03.455736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.517 [2024-07-23 05:08:03.455784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.517 
[2024-07-23 05:08:03.455804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.517 [2024-07-23 05:08:03.455867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.517 [2024-07-23 05:08:03.455892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.517 [2024-07-23 05:08:03.455955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:32.517 [2024-07-23 05:08:03.455975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.517 #34 NEW cov: 11799 ft: 14316 corp: 10/429b lim: 90 exec/s: 0 rss: 68Mb L: 83/89 MS: 1 EraseBytes- 00:08:32.517 [2024-07-23 05:08:03.505362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.517 [2024-07-23 05:08:03.505397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.517 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:32.517 #35 NEW cov: 11822 ft: 14347 corp: 11/456b lim: 90 exec/s: 0 rss: 68Mb L: 27/89 MS: 1 CopyPart- 00:08:32.517 [2024-07-23 05:08:03.555514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.517 [2024-07-23 05:08:03.555548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.517 #36 NEW cov: 11822 ft: 14382 corp: 12/483b lim: 90 exec/s: 0 rss: 68Mb L: 27/89 MS: 1 ShuffleBytes- 00:08:32.776 [2024-07-23 05:08:03.615679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.776 [2024-07-23 05:08:03.615715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.776 #37 NEW cov: 11822 ft: 14429 corp: 13/511b lim: 90 exec/s: 37 rss: 68Mb L: 28/89 MS: 1 ChangeByte- 00:08:32.776 [2024-07-23 05:08:03.676515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.776 [2024-07-23 05:08:03.676549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.776 [2024-07-23 05:08:03.676607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:32.776 [2024-07-23 05:08:03.676626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.776 [2024-07-23 05:08:03.676684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:32.777 [2024-07-23 05:08:03.676704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.777 [2024-07-23 05:08:03.676761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:32.777 [2024-07-23 05:08:03.676782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.777 [2024-07-23 05:08:03.676843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:32.777 [2024-07-23 05:08:03.676864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:32.777 #38 NEW cov: 11822 ft: 14493 corp: 14/601b lim: 90 exec/s: 38 rss: 68Mb L: 90/90 MS: 1 InsertByte- 00:08:32.777 [2024-07-23 05:08:03.725977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.777 [2024-07-23 05:08:03.726011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.777 #39 NEW cov: 11822 ft: 14500 corp: 15/629b lim: 90 exec/s: 39 rss: 68Mb L: 28/90 MS: 1 ChangeByte- 00:08:32.777 [2024-07-23 05:08:03.786155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.777 [2024-07-23 05:08:03.786189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.777 #40 NEW cov: 11822 ft: 14527 corp: 16/657b lim: 90 exec/s: 40 rss: 68Mb L: 28/90 MS: 1 ShuffleBytes- 00:08:32.777 [2024-07-23 05:08:03.846366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:32.777 [2024-07-23 05:08:03.846401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.036 #41 NEW cov: 11822 ft: 14551 corp: 17/678b lim: 90 exec/s: 41 rss: 68Mb L: 21/90 MS: 1 InsertByte- 00:08:33.036 [2024-07-23 05:08:03.906805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.036 [2024-07-23 05:08:03.906840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.036 [2024-07-23 05:08:03.906888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.036 [2024-07-23 05:08:03.906908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.036 [2024-07-23 05:08:03.906971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.036 [2024-07-23 05:08:03.906989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.036 #42 NEW cov: 11822 ft: 14841 corp: 18/733b lim: 90 exec/s: 42 rss: 68Mb L: 55/90 MS: 1 CrossOver- 00:08:33.036 [2024-07-23 05:08:03.956640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.036 [2024-07-23 05:08:03.956674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.036 #43 NEW cov: 11822 ft: 14846 corp: 19/768b lim: 90 exec/s: 43 rss: 68Mb L: 35/90 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:33.036 [2024-07-23 05:08:04.016811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.036 [2024-07-23 05:08:04.016845] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.036 #44 NEW cov: 11822 ft: 14848 corp: 20/797b lim: 90 exec/s: 44 rss: 69Mb L: 29/90 MS: 1 InsertByte- 00:08:33.036 [2024-07-23 05:08:04.076966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.036 [2024-07-23 05:08:04.076999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.036 #45 NEW cov: 11822 ft: 14861 corp: 21/824b lim: 90 exec/s: 45 rss: 69Mb L: 27/90 MS: 1 ShuffleBytes- 00:08:33.036 [2024-07-23 05:08:04.117814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.036 [2024-07-23 05:08:04.117848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.036 [2024-07-23 05:08:04.117896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.036 [2024-07-23 05:08:04.117916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.036 [2024-07-23 05:08:04.117978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.036 [2024-07-23 05:08:04.117998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.036 [2024-07-23 05:08:04.118061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.036 [2024-07-23 05:08:04.118082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.296 #46 NEW cov: 11822 ft: 14907 corp: 22/913b lim: 90 exec/s: 46 rss: 69Mb L: 89/90 MS: 1 ChangeBit- 00:08:33.296 [2024-07-23 05:08:04.157203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.296 [2024-07-23 05:08:04.157241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.296 #47 NEW cov: 11822 ft: 14948 corp: 23/948b lim: 90 exec/s: 47 rss: 69Mb L: 35/90 MS: 1 CopyPart- 00:08:33.296 [2024-07-23 05:08:04.217412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.296 [2024-07-23 05:08:04.217451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.296 #53 NEW cov: 11822 ft: 14953 corp: 24/975b lim: 90 exec/s: 53 rss: 69Mb L: 27/90 MS: 1 CMP- DE: "\0010\343[\311\212SF"- 00:08:33.296 [2024-07-23 05:08:04.258146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.296 [2024-07-23 05:08:04.258180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.258240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.296 [2024-07-23 05:08:04.258259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.258323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.296 [2024-07-23 05:08:04.258343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.258402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.296 [2024-07-23 05:08:04.258422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.258491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:33.296 [2024-07-23 05:08:04.258513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:33.296 #54 NEW cov: 11822 ft: 14974 corp: 25/1065b lim: 90 exec/s: 54 rss: 69Mb L: 90/90 MS: 1 ShuffleBytes- 00:08:33.296 [2024-07-23 05:08:04.318180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.296 [2024-07-23 05:08:04.318214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.318257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.296 [2024-07-23 05:08:04.318280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.318340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.296 [2024-07-23 05:08:04.318360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.318420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.296 [2024-07-23 05:08:04.318440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.296 #55 NEW cov: 11822 ft: 15036 corp: 26/1154b lim: 90 exec/s: 55 rss: 69Mb L: 89/90 MS: 1 ChangeBit- 00:08:33.296 [2024-07-23 05:08:04.358262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.296 [2024-07-23 05:08:04.358297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.358349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.296 [2024-07-23 05:08:04.358369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.358434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.296 [2024-07-23 05:08:04.358459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.296 [2024-07-23 05:08:04.358524] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.296 [2024-07-23 05:08:04.358544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.555 #56 NEW cov: 11822 ft: 15052 corp: 27/1237b lim: 90 exec/s: 56 rss: 69Mb L: 83/90 MS: 1 PersAutoDict- DE: "\0010\343[\311\212SF"- 00:08:33.556 [2024-07-23 05:08:04.408419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.556 [2024-07-23 05:08:04.408458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.556 [2024-07-23 05:08:04.408516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.556 [2024-07-23 05:08:04.408542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.556 [2024-07-23 05:08:04.408604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.556 [2024-07-23 05:08:04.408625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.556 [2024-07-23 05:08:04.408690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.556 [2024-07-23 05:08:04.408709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.556 #57 NEW cov: 11822 ft: 15067 corp: 28/1326b lim: 90 exec/s: 57 rss: 69Mb L: 89/90 MS: 1 ChangeByte- 00:08:33.556 [2024-07-23 05:08:04.448100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.556 [2024-07-23 05:08:04.448134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.556 #58 NEW cov: 11822 ft: 15089 corp: 29/1353b lim: 90 exec/s: 58 rss: 69Mb L: 27/90 MS: 1 ChangeByte- 00:08:33.556 [2024-07-23 05:08:04.488196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.556 [2024-07-23 05:08:04.488230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.556 #59 NEW cov: 11822 ft: 15113 corp: 30/1381b lim: 90 exec/s: 59 rss: 69Mb L: 28/90 MS: 1 InsertByte- 00:08:33.556 [2024-07-23 05:08:04.538808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.556 [2024-07-23 05:08:04.538841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.556 [2024-07-23 05:08:04.538899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.556 [2024-07-23 05:08:04.538919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.556 [2024-07-23 05:08:04.538981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:33.556 [2024-07-23 05:08:04.539001] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.556 [2024-07-23 05:08:04.539060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:33.556 [2024-07-23 05:08:04.539081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.556 #60 NEW cov: 11822 ft: 15285 corp: 31/1457b lim: 90 exec/s: 60 rss: 69Mb L: 76/90 MS: 1 InsertRepeatedBytes- 00:08:33.556 [2024-07-23 05:08:04.578608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.556 [2024-07-23 05:08:04.578642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.556 [2024-07-23 05:08:04.578692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:33.556 [2024-07-23 05:08:04.578710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.556 #61 NEW cov: 11822 ft: 15296 corp: 32/1498b lim: 90 exec/s: 61 rss: 69Mb L: 41/90 MS: 1 CopyPart- 00:08:33.556 [2024-07-23 05:08:04.628657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:33.556 [2024-07-23 05:08:04.628690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.815 #62 NEW cov: 11822 ft: 15317 corp: 33/1525b lim: 90 exec/s: 31 rss: 69Mb L: 27/90 MS: 1 EraseBytes- 00:08:33.815 #62 DONE cov: 11822 ft: 15317 corp: 33/1525b lim: 90 exec/s: 31 rss: 69Mb 00:08:33.816 ###### Recommended dictionary. ###### 00:08:33.816 "\377\377\377\377\377\377\377\377" # Uses: 1 00:08:33.816 "\0010\343[\311\212SF" # Uses: 1 00:08:33.816 ###### End of recommended dictionary. 
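The NEW_FUNC[...] lines earlier in this run are libFuzzer resolving symbols in the harness (test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c) as coverage first reaches them; TestOneInput at llvm_nvme_fuzz.c:780 is the per-input entry point. For orientation only, here is a minimal sketch of that entry point's shape — the body is an illustrative stand-in, not SPDK's actual implementation, which maps the fuzz bytes onto an NVMe command (Reservation Acquire for fuzzer 20, Reservation Release for fuzzer 21) and submits it to the TCP target that run.sh configured above:

    /* Sketch of a libFuzzer harness entry point; the signature is the
     * standard one libFuzzer requires. Everything inside the body is a
     * simplified stand-in for what llvm_nvme_fuzz.c actually does. */
    #include <stddef.h>
    #include <stdint.h>

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        if (size < 8) {
            return 0;            /* too short to form a command; skip */
        }
        /* A real harness copies `data` into command fields (opcode,
         * nsid, cdw10-15, LBA, length, ...) and submits the command to
         * the target; the WRITE UNCORRECTABLE / RESERVATION ACQUIRE
         * prints in this log are those fuzzed commands being issued. */
        (void)data;
        return 0;                /* 0 = input processed, no crash */
    }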
###### 00:08:33.816 Done 62 runs in 2 second(s) 00:08:33.816 05:08:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:33.816 05:08:04 -- ../common.sh@72 -- # (( i++ )) 00:08:33.816 05:08:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.816 05:08:04 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:33.816 05:08:04 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:33.816 05:08:04 -- nvmf/run.sh@24 -- # local timen=1 00:08:33.816 05:08:04 -- nvmf/run.sh@25 -- # local core=0x1 00:08:33.816 05:08:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:33.816 05:08:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:33.816 05:08:04 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:33.816 05:08:04 -- nvmf/run.sh@29 -- # port=4421 00:08:33.816 05:08:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:33.816 05:08:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:33.816 05:08:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:33.816 05:08:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:33.816 [2024-07-23 05:08:04.839359] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:33.816 [2024-07-23 05:08:04.839428] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3148466 ] 00:08:33.816 EAL: No free 2048 kB hugepages reported on node 1 00:08:34.075 [2024-07-23 05:08:05.057335] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.075 [2024-07-23 05:08:05.133166] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:34.075 [2024-07-23 05:08:05.133339] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.334 [2024-07-23 05:08:05.194495] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:34.334 [2024-07-23 05:08:05.210856] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:34.334 INFO: Running with entropic power schedule (0xFF, 100). 00:08:34.334 INFO: Seed: 1483234019 00:08:34.334 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:34.334 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:34.334 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:34.334 INFO: A corpus is not provided, starting from an empty corpus 00:08:34.334 #2 INITED exec/s: 0 rss: 60Mb 00:08:34.334 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:34.334 This may also happen if the target rejected all inputs we tried so far 00:08:34.335 [2024-07-23 05:08:05.266315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.335 [2024-07-23 05:08:05.266353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.335 [2024-07-23 05:08:05.266419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:34.335 [2024-07-23 05:08:05.266440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.272 NEW_FUNC[1/672]: 0x4a6030 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:35.272 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:35.272 #16 NEW cov: 11570 ft: 11565 corp: 2/23b lim: 50 exec/s: 0 rss: 67Mb L: 22/22 MS: 4 CopyPart-InsertByte-InsertByte-InsertRepeatedBytes- 00:08:35.272 [2024-07-23 05:08:06.228955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.272 [2024-07-23 05:08:06.229000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.272 [2024-07-23 05:08:06.229061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.272 [2024-07-23 05:08:06.229081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.272 [2024-07-23 05:08:06.229147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.272 [2024-07-23 05:08:06.229166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.272 #17 NEW cov: 11683 ft: 12276 corp: 3/61b lim: 50 exec/s: 17 rss: 67Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:35.272 [2024-07-23 05:08:06.289053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.272 [2024-07-23 05:08:06.289089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.272 [2024-07-23 05:08:06.289136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.272 [2024-07-23 05:08:06.289158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.272 [2024-07-23 05:08:06.289221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.272 [2024-07-23 05:08:06.289241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.272 #18 NEW cov: 11689 ft: 12406 corp: 4/99b lim: 50 exec/s: 18 rss: 67Mb L: 38/38 MS: 1 ChangeBit- 00:08:35.272 [2024-07-23 05:08:06.339474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.272 [2024-07-23 05:08:06.339509] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.272 [2024-07-23 05:08:06.339566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.272 [2024-07-23 05:08:06.339588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.272 [2024-07-23 05:08:06.339653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.272 [2024-07-23 05:08:06.339678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.272 [2024-07-23 05:08:06.339742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.272 [2024-07-23 05:08:06.339763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.272 [2024-07-23 05:08:06.339829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:35.272 [2024-07-23 05:08:06.339850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.532 #19 NEW cov: 11774 ft: 13086 corp: 5/149b lim: 50 exec/s: 19 rss: 67Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:35.532 [2024-07-23 05:08:06.389465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.532 [2024-07-23 05:08:06.389499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.389559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.532 [2024-07-23 05:08:06.389581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.389644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.532 [2024-07-23 05:08:06.389664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.389730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.532 [2024-07-23 05:08:06.389750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.532 #20 NEW cov: 11774 ft: 13315 corp: 6/189b lim: 50 exec/s: 20 rss: 67Mb L: 40/50 MS: 1 CrossOver- 00:08:35.532 [2024-07-23 05:08:06.439245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.532 [2024-07-23 05:08:06.439280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.439339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.532 [2024-07-23 05:08:06.439361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:35.532 #21 NEW cov: 11774 ft: 13427 corp: 7/212b lim: 50 exec/s: 21 rss: 67Mb L: 23/50 MS: 1 InsertByte- 00:08:35.532 [2024-07-23 05:08:06.489594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.532 [2024-07-23 05:08:06.489628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.489669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.532 [2024-07-23 05:08:06.489694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.489760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.532 [2024-07-23 05:08:06.489782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.532 #22 NEW cov: 11774 ft: 13479 corp: 8/249b lim: 50 exec/s: 22 rss: 67Mb L: 37/50 MS: 1 EraseBytes- 00:08:35.532 [2024-07-23 05:08:06.550131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.532 [2024-07-23 05:08:06.550168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.550214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.532 [2024-07-23 05:08:06.550235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.550298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.532 [2024-07-23 05:08:06.550320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.550386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.532 [2024-07-23 05:08:06.550407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.550474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:35.532 [2024-07-23 05:08:06.550496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.532 #23 NEW cov: 11774 ft: 13519 corp: 9/299b lim: 50 exec/s: 23 rss: 68Mb L: 50/50 MS: 1 ChangeBit- 00:08:35.532 [2024-07-23 05:08:06.610080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.532 [2024-07-23 05:08:06.610115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.532 [2024-07-23 05:08:06.610167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.532 [2024-07-23 05:08:06.610188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:35.533 [2024-07-23 05:08:06.610251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.533 [2024-07-23 05:08:06.610272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.533 [2024-07-23 05:08:06.610337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.533 [2024-07-23 05:08:06.610359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.792 #24 NEW cov: 11774 ft: 13570 corp: 10/346b lim: 50 exec/s: 24 rss: 68Mb L: 47/50 MS: 1 InsertRepeatedBytes- 00:08:35.792 [2024-07-23 05:08:06.670273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.792 [2024-07-23 05:08:06.670309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.792 [2024-07-23 05:08:06.670358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.792 [2024-07-23 05:08:06.670379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.792 [2024-07-23 05:08:06.670448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.792 [2024-07-23 05:08:06.670470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.792 [2024-07-23 05:08:06.670536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.792 [2024-07-23 05:08:06.670558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.792 #25 NEW cov: 11774 ft: 13628 corp: 11/391b lim: 50 exec/s: 25 rss: 68Mb L: 45/50 MS: 1 CopyPart- 00:08:35.792 [2024-07-23 05:08:06.730129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.792 [2024-07-23 05:08:06.730168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.792 [2024-07-23 05:08:06.730235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.792 [2024-07-23 05:08:06.730257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.792 #26 NEW cov: 11774 ft: 13660 corp: 12/415b lim: 50 exec/s: 26 rss: 68Mb L: 24/50 MS: 1 CrossOver- 00:08:35.792 [2024-07-23 05:08:06.770461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.792 [2024-07-23 05:08:06.770495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.792 [2024-07-23 05:08:06.770538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.792 [2024-07-23 05:08:06.770565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:35.792 [2024-07-23 05:08:06.770628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.793 [2024-07-23 05:08:06.770646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.793 #27 NEW cov: 11774 ft: 13669 corp: 13/453b lim: 50 exec/s: 27 rss: 68Mb L: 38/50 MS: 1 ChangeBit- 00:08:35.793 [2024-07-23 05:08:06.830779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.793 [2024-07-23 05:08:06.830813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.793 [2024-07-23 05:08:06.830874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.793 [2024-07-23 05:08:06.830895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.793 [2024-07-23 05:08:06.830959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.793 [2024-07-23 05:08:06.830980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.793 [2024-07-23 05:08:06.831047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.793 [2024-07-23 05:08:06.831067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.793 #28 NEW cov: 11774 ft: 13724 corp: 14/498b lim: 50 exec/s: 28 rss: 68Mb L: 45/50 MS: 1 InsertRepeatedBytes- 00:08:35.793 [2024-07-23 05:08:06.880859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:35.793 [2024-07-23 05:08:06.880894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.793 [2024-07-23 05:08:06.880952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:35.793 [2024-07-23 05:08:06.880974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.793 [2024-07-23 05:08:06.881036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:35.793 [2024-07-23 05:08:06.881057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.793 [2024-07-23 05:08:06.881122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:35.793 [2024-07-23 05:08:06.881143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.052 #29 NEW cov: 11774 ft: 13728 corp: 15/544b lim: 50 exec/s: 29 rss: 68Mb L: 46/50 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:36.052 [2024-07-23 05:08:06.940862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.052 [2024-07-23 05:08:06.940895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.052 [2024-07-23 05:08:06.940940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:36.052 [2024-07-23 05:08:06.940960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.052 [2024-07-23 05:08:06.941025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:36.052 [2024-07-23 05:08:06.941047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.052 #30 NEW cov: 11774 ft: 13748 corp: 16/583b lim: 50 exec/s: 30 rss: 68Mb L: 39/50 MS: 1 InsertByte- 00:08:36.052 [2024-07-23 05:08:06.991211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.052 [2024-07-23 05:08:06.991245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.052 [2024-07-23 05:08:06.991305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:36.052 [2024-07-23 05:08:06.991326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.052 [2024-07-23 05:08:06.991389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:36.052 [2024-07-23 05:08:06.991410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.052 [2024-07-23 05:08:06.991476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:36.052 [2024-07-23 05:08:06.991497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.052 #31 NEW cov: 11774 ft: 13782 corp: 17/628b lim: 50 exec/s: 31 rss: 68Mb L: 45/50 MS: 1 CopyPart- 00:08:36.052 [2024-07-23 05:08:07.051344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.052 [2024-07-23 05:08:07.051378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.052 [2024-07-23 05:08:07.051435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:36.052 [2024-07-23 05:08:07.051462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.053 [2024-07-23 05:08:07.051539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:36.053 [2024-07-23 05:08:07.051561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.053 [2024-07-23 05:08:07.051627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:36.053 [2024-07-23 05:08:07.051648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.053 #32 NEW cov: 11774 ft: 13798 corp: 18/675b lim: 50 exec/s: 
32 rss: 68Mb L: 47/50 MS: 1 CMP- DE: "\001\000"- 00:08:36.053 [2024-07-23 05:08:07.111543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.053 [2024-07-23 05:08:07.111577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.053 [2024-07-23 05:08:07.111639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:36.053 [2024-07-23 05:08:07.111661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.053 [2024-07-23 05:08:07.111730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:36.053 [2024-07-23 05:08:07.111751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.053 [2024-07-23 05:08:07.111816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:36.053 [2024-07-23 05:08:07.111838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.312 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:36.312 #33 NEW cov: 11797 ft: 13812 corp: 19/722b lim: 50 exec/s: 33 rss: 69Mb L: 47/50 MS: 1 ChangeBinInt- 00:08:36.312 [2024-07-23 05:08:07.171868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.312 [2024-07-23 05:08:07.171903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.312 [2024-07-23 05:08:07.171963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:36.312 [2024-07-23 05:08:07.171984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.312 [2024-07-23 05:08:07.172047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:36.312 [2024-07-23 05:08:07.172068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.312 [2024-07-23 05:08:07.172136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:36.312 [2024-07-23 05:08:07.172157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.312 [2024-07-23 05:08:07.172221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:36.312 [2024-07-23 05:08:07.172241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.312 #34 NEW cov: 11797 ft: 13909 corp: 20/772b lim: 50 exec/s: 34 rss: 69Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:36.312 [2024-07-23 05:08:07.222044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:36.312 [2024-07-23 05:08:07.222078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.312 [2024-07-23 05:08:07.222140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:36.312 [2024-07-23 05:08:07.222161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.312 [2024-07-23 05:08:07.222226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:36.312 [2024-07-23 05:08:07.222247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.312 [2024-07-23 05:08:07.222312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:36.312 [2024-07-23 05:08:07.222332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.312 [2024-07-23 05:08:07.222396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:36.312 [2024-07-23 05:08:07.222417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.312 #35 NEW cov: 11797 ft: 13912 corp: 21/822b lim: 50 exec/s: 17 rss: 69Mb L: 50/50 MS: 1 ChangeByte- 00:08:36.312 #35 DONE cov: 11797 ft: 13912 corp: 21/822b lim: 50 exec/s: 17 rss: 69Mb 00:08:36.312 ###### Recommended dictionary. ###### 00:08:36.312 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:36.312 "\001\000" # Uses: 0 00:08:36.312 ###### End of recommended dictionary. ###### 00:08:36.312 Done 35 runs in 2 second(s) 00:08:36.312 05:08:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:36.312 05:08:07 -- ../common.sh@72 -- # (( i++ )) 00:08:36.312 05:08:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.312 05:08:07 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:36.312 05:08:07 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:36.312 05:08:07 -- nvmf/run.sh@24 -- # local timen=1 00:08:36.312 05:08:07 -- nvmf/run.sh@25 -- # local core=0x1 00:08:36.312 05:08:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:36.312 05:08:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:36.313 05:08:07 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:36.313 05:08:07 -- nvmf/run.sh@29 -- # port=4422 00:08:36.313 05:08:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:36.572 05:08:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:36.572 05:08:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:36.572 05:08:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:36.572 [2024-07-23 05:08:07.440142] Starting SPDK v24.01.1-pre git sha1 
dbef7efac / DPDK 23.11.0 initialization... 00:08:36.572 [2024-07-23 05:08:07.440211] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3148950 ] 00:08:36.572 EAL: No free 2048 kB hugepages reported on node 1 00:08:36.572 [2024-07-23 05:08:07.656376] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.831 [2024-07-23 05:08:07.732126] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:36.831 [2024-07-23 05:08:07.732299] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.831 [2024-07-23 05:08:07.793271] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:36.831 [2024-07-23 05:08:07.809644] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:36.831 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.831 INFO: Seed: 4081231992 00:08:36.831 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:36.831 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:36.831 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:36.831 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.831 #2 INITED exec/s: 0 rss: 61Mb 00:08:36.831 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:36.831 This may also happen if the target rejected all inputs we tried so far 00:08:36.831 [2024-07-23 05:08:07.876053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.831 [2024-07-23 05:08:07.876098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.831 [2024-07-23 05:08:07.876237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:36.831 [2024-07-23 05:08:07.876267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.400 NEW_FUNC[1/672]: 0x4a82f0 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:37.400 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:37.400 #5 NEW cov: 11596 ft: 11576 corp: 2/43b lim: 85 exec/s: 0 rss: 67Mb L: 42/42 MS: 3 CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:37.400 [2024-07-23 05:08:08.327174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.400 [2024-07-23 05:08:08.327221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.400 [2024-07-23 05:08:08.327351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.400 [2024-07-23 05:08:08.327378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.400 #6 NEW cov: 11709 ft: 12246 corp: 3/86b lim: 85 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 
InsertByte- 00:08:37.400 [2024-07-23 05:08:08.397287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.400 [2024-07-23 05:08:08.397331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.400 [2024-07-23 05:08:08.397457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.400 [2024-07-23 05:08:08.397487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.400 #7 NEW cov: 11715 ft: 12458 corp: 4/135b lim: 85 exec/s: 0 rss: 67Mb L: 49/49 MS: 1 InsertRepeatedBytes- 00:08:37.400 [2024-07-23 05:08:08.467538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.400 [2024-07-23 05:08:08.467577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.400 [2024-07-23 05:08:08.467712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.400 [2024-07-23 05:08:08.467739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.659 #8 NEW cov: 11800 ft: 12814 corp: 5/184b lim: 85 exec/s: 0 rss: 68Mb L: 49/49 MS: 1 ShuffleBytes- 00:08:37.659 [2024-07-23 05:08:08.537753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.659 [2024-07-23 05:08:08.537796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.659 [2024-07-23 05:08:08.537922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.659 [2024-07-23 05:08:08.537952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.659 #9 NEW cov: 11800 ft: 12895 corp: 6/233b lim: 85 exec/s: 0 rss: 68Mb L: 49/49 MS: 1 ChangeByte- 00:08:37.659 [2024-07-23 05:08:08.597965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.659 [2024-07-23 05:08:08.598005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.659 [2024-07-23 05:08:08.598131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.659 [2024-07-23 05:08:08.598161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.659 #10 NEW cov: 11800 ft: 13062 corp: 7/275b lim: 85 exec/s: 0 rss: 68Mb L: 42/49 MS: 1 ChangeByte- 00:08:37.659 [2024-07-23 05:08:08.658151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.659 [2024-07-23 05:08:08.658188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.659 [2024-07-23 05:08:08.658310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.659 [2024-07-23 05:08:08.658341] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.659 #11 NEW cov: 11800 ft: 13100 corp: 8/325b lim: 85 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 InsertByte- 00:08:37.659 [2024-07-23 05:08:08.728397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.659 [2024-07-23 05:08:08.728434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.659 [2024-07-23 05:08:08.728531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.659 [2024-07-23 05:08:08.728556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.918 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:37.918 #12 NEW cov: 11823 ft: 13159 corp: 9/367b lim: 85 exec/s: 0 rss: 68Mb L: 42/50 MS: 1 CopyPart- 00:08:37.918 [2024-07-23 05:08:08.788958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.918 [2024-07-23 05:08:08.788991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.918 [2024-07-23 05:08:08.789093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.918 [2024-07-23 05:08:08.789121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.918 [2024-07-23 05:08:08.789251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:37.918 [2024-07-23 05:08:08.789278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.918 #13 NEW cov: 11823 ft: 13524 corp: 10/432b lim: 85 exec/s: 0 rss: 68Mb L: 65/65 MS: 1 CopyPart- 00:08:37.918 [2024-07-23 05:08:08.858800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.918 [2024-07-23 05:08:08.858836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.919 [2024-07-23 05:08:08.858967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.919 [2024-07-23 05:08:08.858992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.919 #17 NEW cov: 11823 ft: 13570 corp: 11/476b lim: 85 exec/s: 17 rss: 68Mb L: 44/65 MS: 4 CopyPart-InsertByte-EraseBytes-CrossOver- 00:08:37.919 [2024-07-23 05:08:08.919613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.919 [2024-07-23 05:08:08.919654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.919 [2024-07-23 05:08:08.919712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.919 [2024-07-23 05:08:08.919738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.919 [2024-07-23 05:08:08.919866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:37.919 [2024-07-23 05:08:08.919890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.919 [2024-07-23 05:08:08.920021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:37.919 [2024-07-23 05:08:08.920049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.919 #18 NEW cov: 11823 ft: 13907 corp: 12/547b lim: 85 exec/s: 18 rss: 68Mb L: 71/71 MS: 1 CopyPart- 00:08:37.919 [2024-07-23 05:08:08.989235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:37.919 [2024-07-23 05:08:08.989273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.919 [2024-07-23 05:08:08.989382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:37.919 [2024-07-23 05:08:08.989408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.178 #19 NEW cov: 11823 ft: 13941 corp: 13/589b lim: 85 exec/s: 19 rss: 69Mb L: 42/71 MS: 1 ChangeByte- 00:08:38.178 [2024-07-23 05:08:09.059326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.178 [2024-07-23 05:08:09.059368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.178 [2024-07-23 05:08:09.059466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.178 [2024-07-23 05:08:09.059487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.178 #20 NEW cov: 11823 ft: 14002 corp: 14/633b lim: 85 exec/s: 20 rss: 69Mb L: 44/71 MS: 1 InsertByte- 00:08:38.178 [2024-07-23 05:08:09.119621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.178 [2024-07-23 05:08:09.119659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.178 [2024-07-23 05:08:09.119775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.178 [2024-07-23 05:08:09.119798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.178 #21 NEW cov: 11823 ft: 14009 corp: 15/682b lim: 85 exec/s: 21 rss: 69Mb L: 49/71 MS: 1 ChangeByte- 00:08:38.178 [2024-07-23 05:08:09.180072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.178 [2024-07-23 05:08:09.180111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.178 [2024-07-23 05:08:09.180194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 
cid:1 nsid:0 00:08:38.178 [2024-07-23 05:08:09.180223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.178 [2024-07-23 05:08:09.180350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.178 [2024-07-23 05:08:09.180374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.178 #22 NEW cov: 11823 ft: 14030 corp: 16/747b lim: 85 exec/s: 22 rss: 69Mb L: 65/71 MS: 1 CopyPart- 00:08:38.178 [2024-07-23 05:08:09.250375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.178 [2024-07-23 05:08:09.250416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.178 [2024-07-23 05:08:09.250485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.178 [2024-07-23 05:08:09.250510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.178 [2024-07-23 05:08:09.250638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.178 [2024-07-23 05:08:09.250664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.470 #23 NEW cov: 11823 ft: 14047 corp: 17/812b lim: 85 exec/s: 23 rss: 69Mb L: 65/71 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:38.470 [2024-07-23 05:08:09.310627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.470 [2024-07-23 05:08:09.310666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.470 [2024-07-23 05:08:09.310758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.470 [2024-07-23 05:08:09.310785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.470 [2024-07-23 05:08:09.310910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.470 [2024-07-23 05:08:09.310938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.470 #24 NEW cov: 11823 ft: 14099 corp: 18/877b lim: 85 exec/s: 24 rss: 69Mb L: 65/71 MS: 1 ShuffleBytes- 00:08:38.470 [2024-07-23 05:08:09.380463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.470 [2024-07-23 05:08:09.380500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.470 [2024-07-23 05:08:09.380581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.470 [2024-07-23 05:08:09.380610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.470 #25 NEW cov: 11823 ft: 14108 corp: 19/919b lim: 85 exec/s: 25 rss: 69Mb L: 42/71 MS: 
1 CopyPart- 00:08:38.470 [2024-07-23 05:08:09.441245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.470 [2024-07-23 05:08:09.441284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.470 [2024-07-23 05:08:09.441356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.470 [2024-07-23 05:08:09.441386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.470 [2024-07-23 05:08:09.441508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.470 [2024-07-23 05:08:09.441538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.470 [2024-07-23 05:08:09.441660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:38.471 [2024-07-23 05:08:09.441688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.471 #26 NEW cov: 11823 ft: 14121 corp: 20/987b lim: 85 exec/s: 26 rss: 69Mb L: 68/71 MS: 1 InsertRepeatedBytes- 00:08:38.471 [2024-07-23 05:08:09.511182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.471 [2024-07-23 05:08:09.511225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.471 [2024-07-23 05:08:09.511331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.471 [2024-07-23 05:08:09.511358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.471 [2024-07-23 05:08:09.511480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.471 [2024-07-23 05:08:09.511506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.471 #27 NEW cov: 11823 ft: 14153 corp: 21/1042b lim: 85 exec/s: 27 rss: 69Mb L: 55/71 MS: 1 InsertRepeatedBytes- 00:08:38.730 [2024-07-23 05:08:09.571026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.730 [2024-07-23 05:08:09.571071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.730 [2024-07-23 05:08:09.571188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.730 [2024-07-23 05:08:09.571215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.730 #28 NEW cov: 11823 ft: 14167 corp: 22/1091b lim: 85 exec/s: 28 rss: 69Mb L: 49/71 MS: 1 ChangeBinInt- 00:08:38.730 [2024-07-23 05:08:09.621437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.730 [2024-07-23 05:08:09.621482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.730 [2024-07-23 05:08:09.621541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.730 [2024-07-23 05:08:09.621569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.730 [2024-07-23 05:08:09.621696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.730 [2024-07-23 05:08:09.621727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.730 #34 NEW cov: 11823 ft: 14199 corp: 23/1148b lim: 85 exec/s: 34 rss: 69Mb L: 57/71 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:38.730 [2024-07-23 05:08:09.682064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.730 [2024-07-23 05:08:09.682103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.730 [2024-07-23 05:08:09.682217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.730 [2024-07-23 05:08:09.682242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.730 [2024-07-23 05:08:09.682375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.730 [2024-07-23 05:08:09.682403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.730 [2024-07-23 05:08:09.682530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:38.730 [2024-07-23 05:08:09.682560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.730 #35 NEW cov: 11823 ft: 14232 corp: 24/1219b lim: 85 exec/s: 35 rss: 69Mb L: 71/71 MS: 1 CopyPart- 00:08:38.730 [2024-07-23 05:08:09.751960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.730 [2024-07-23 05:08:09.751999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.730 [2024-07-23 05:08:09.752091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.730 [2024-07-23 05:08:09.752119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.730 [2024-07-23 05:08:09.752247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.730 [2024-07-23 05:08:09.752274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.730 #36 NEW cov: 11823 ft: 14240 corp: 25/1284b lim: 85 exec/s: 36 rss: 69Mb L: 65/71 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:38.731 [2024-07-23 05:08:09.811826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.731 [2024-07-23 05:08:09.811870] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.731 [2024-07-23 05:08:09.812005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.731 [2024-07-23 05:08:09.812035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.989 #37 NEW cov: 11823 ft: 14280 corp: 26/1333b lim: 85 exec/s: 37 rss: 70Mb L: 49/71 MS: 1 ShuffleBytes- 00:08:38.989 [2024-07-23 05:08:09.882638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:38.989 [2024-07-23 05:08:09.882677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.989 [2024-07-23 05:08:09.882738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:38.989 [2024-07-23 05:08:09.882763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.989 [2024-07-23 05:08:09.882892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:38.989 [2024-07-23 05:08:09.882918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.989 [2024-07-23 05:08:09.883047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:38.989 [2024-07-23 05:08:09.883074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.989 #38 NEW cov: 11823 ft: 14294 corp: 27/1405b lim: 85 exec/s: 19 rss: 70Mb L: 72/72 MS: 1 InsertByte- 00:08:38.989 #38 DONE cov: 11823 ft: 14294 corp: 27/1405b lim: 85 exec/s: 19 rss: 70Mb 00:08:38.989 ###### Recommended dictionary. ###### 00:08:38.989 "\377\377\377\377\377\377\377\377" # Uses: 2 00:08:38.989 ###### End of recommended dictionary. 
###### 00:08:38.990 Done 38 runs in 2 second(s) 00:08:38.990 05:08:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:38.990 05:08:10 -- ../common.sh@72 -- # (( i++ )) 00:08:38.990 05:08:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.990 05:08:10 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:38.990 05:08:10 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:38.990 05:08:10 -- nvmf/run.sh@24 -- # local timen=1 00:08:38.990 05:08:10 -- nvmf/run.sh@25 -- # local core=0x1 00:08:38.990 05:08:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:38.990 05:08:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:38.990 05:08:10 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:38.990 05:08:10 -- nvmf/run.sh@29 -- # port=4423 00:08:38.990 05:08:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:38.990 05:08:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:38.990 05:08:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:38.990 05:08:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:39.249 [2024-07-23 05:08:10.084447] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:39.249 [2024-07-23 05:08:10.084516] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3149494 ] 00:08:39.249 EAL: No free 2048 kB hugepages reported on node 1 00:08:39.249 [2024-07-23 05:08:10.296760] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.508 [2024-07-23 05:08:10.373285] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:39.508 [2024-07-23 05:08:10.373468] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.508 [2024-07-23 05:08:10.434418] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:39.508 [2024-07-23 05:08:10.450772] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:39.508 INFO: Running with entropic power schedule (0xFF, 100). 00:08:39.508 INFO: Seed: 2427267827 00:08:39.508 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:39.508 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:39.508 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:39.508 INFO: A corpus is not provided, starting from an empty corpus 00:08:39.508 #2 INITED exec/s: 0 rss: 60Mb 00:08:39.508 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
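[Editorial sketch] The block above tears down run 22 and launches fuzzer type 23: run.sh picks port 4423, patches the trsvcid into /tmp/fuzz_json_23.conf with sed, and starts llvm_nvme_fuzz with -Z 23, which registers fuzz_nvm_reservation_report_command (RESERVATION REPORT, opcode 0e), just as run 21 exercised RESERVATION RELEASE (15) and run 22 RESERVATION REGISTER (0d). Below is a minimal sketch of replaying one such run by hand. Every flag is copied from the invocation logged above; SPDK_DIR and the comments mapping each flag to its run.sh variable are editorial assumptions, so adjust them for a local checkout.

#!/usr/bin/env bash
# Sketch: replay fuzzer type 23 (RESERVATION REPORT) outside Jenkins.
# SPDK_DIR is the CI path from the log; point it at a local checkout instead.
SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
TRID='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423'

args=(
  -m 0x1                                 # core mask (run.sh: core=0x1)
  -s 512                                 # hugepage memory size in MB
  -P "$SPDK_DIR/../output/llvm/"         # artifact/output prefix
  -F "$TRID"                             # transport ID of the TCP listener above
  -c /tmp/fuzz_json_23.conf              # per-run config (run.sh: nvmf_cfg, trsvcid patched by sed)
  -t 1                                   # time budget (run.sh: timen=1)
  -D "$SPDK_DIR/../corpus/llvm_nvmf_23"  # persistent corpus dir (run.sh: corpus_dir)
  -Z 23                                  # fuzzer type selector (run.sh: fuzzer_type)
  -r /var/tmp/spdk23.sock                # RPC socket
)
"$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" "${args[@]}"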
00:08:39.508 This may also happen if the target rejected all inputs we tried so far 00:08:39.508 [2024-07-23 05:08:10.500195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.508 [2024-07-23 05:08:10.500233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.509 [2024-07-23 05:08:10.500278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.509 [2024-07-23 05:08:10.500297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.509 [2024-07-23 05:08:10.500361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:39.509 [2024-07-23 05:08:10.500382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.077 NEW_FUNC[1/671]: 0x4ab520 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:40.077 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.077 #7 NEW cov: 11529 ft: 11529 corp: 2/19b lim: 25 exec/s: 0 rss: 69Mb L: 18/18 MS: 5 InsertByte-ChangeBinInt-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:40.077 [2024-07-23 05:08:10.940993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.077 [2024-07-23 05:08:10.941035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.077 #8 NEW cov: 11642 ft: 12493 corp: 3/28b lim: 25 exec/s: 0 rss: 69Mb L: 9/18 MS: 1 CMP- DE: "\000\000\000\000\002 \217S"- 00:08:40.078 [2024-07-23 05:08:10.991461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.078 [2024-07-23 05:08:10.991495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:10.991539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.078 [2024-07-23 05:08:10.991559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:10.991621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.078 [2024-07-23 05:08:10.991641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:10.991704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.078 [2024-07-23 05:08:10.991724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.078 #9 NEW cov: 11648 ft: 13083 corp: 4/51b lim: 25 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 CrossOver- 00:08:40.078 [2024-07-23 05:08:11.051481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.078 [2024-07-23 
05:08:11.051520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:11.051554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.078 [2024-07-23 05:08:11.051575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:11.051639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.078 [2024-07-23 05:08:11.051660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.078 #10 NEW cov: 11733 ft: 13347 corp: 5/68b lim: 25 exec/s: 0 rss: 69Mb L: 17/23 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:40.078 [2024-07-23 05:08:11.111733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.078 [2024-07-23 05:08:11.111768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:11.111825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.078 [2024-07-23 05:08:11.111845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:11.111909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.078 [2024-07-23 05:08:11.111928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:11.111993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.078 [2024-07-23 05:08:11.112014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.078 #16 NEW cov: 11733 ft: 13386 corp: 6/90b lim: 25 exec/s: 0 rss: 69Mb L: 22/23 MS: 1 InsertRepeatedBytes- 00:08:40.078 [2024-07-23 05:08:11.151899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.078 [2024-07-23 05:08:11.151935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:11.151985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.078 [2024-07-23 05:08:11.152006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:11.152070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.078 [2024-07-23 05:08:11.152091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.078 [2024-07-23 05:08:11.152156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.078 [2024-07-23 05:08:11.152177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.338 #17 NEW cov: 11733 ft: 13536 corp: 7/113b lim: 25 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 ShuffleBytes- 00:08:40.338 [2024-07-23 05:08:11.212056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.338 [2024-07-23 05:08:11.212090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.338 [2024-07-23 05:08:11.212148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.338 [2024-07-23 05:08:11.212168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.338 [2024-07-23 05:08:11.212233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.338 [2024-07-23 05:08:11.212255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.338 [2024-07-23 05:08:11.212321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.338 [2024-07-23 05:08:11.212342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.338 #18 NEW cov: 11733 ft: 13604 corp: 8/136b lim: 25 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 ShuffleBytes- 00:08:40.338 [2024-07-23 05:08:11.262053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.338 [2024-07-23 05:08:11.262087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.338 [2024-07-23 05:08:11.262137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.338 [2024-07-23 05:08:11.262158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.338 [2024-07-23 05:08:11.262223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.338 [2024-07-23 05:08:11.262244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.338 #19 NEW cov: 11733 ft: 13661 corp: 9/154b lim: 25 exec/s: 0 rss: 69Mb L: 18/23 MS: 1 CrossOver- 00:08:40.338 [2024-07-23 05:08:11.311928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.338 [2024-07-23 05:08:11.311963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.338 #20 NEW cov: 11733 ft: 13774 corp: 10/160b lim: 25 exec/s: 0 rss: 70Mb L: 6/23 MS: 1 EraseBytes- 00:08:40.338 [2024-07-23 05:08:11.362121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.338 [2024-07-23 05:08:11.362154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.338 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:40.338 #21 NEW cov: 11756 ft: 13844 
corp: 11/166b lim: 25 exec/s: 0 rss: 70Mb L: 6/23 MS: 1 ChangeByte- 00:08:40.338 [2024-07-23 05:08:11.422612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.338 [2024-07-23 05:08:11.422646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.338 [2024-07-23 05:08:11.422704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.338 [2024-07-23 05:08:11.422724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.338 [2024-07-23 05:08:11.422786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.338 [2024-07-23 05:08:11.422806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.338 [2024-07-23 05:08:11.422868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.338 [2024-07-23 05:08:11.422888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.598 #22 NEW cov: 11756 ft: 13864 corp: 12/190b lim: 25 exec/s: 0 rss: 70Mb L: 24/24 MS: 1 InsertByte- 00:08:40.598 [2024-07-23 05:08:11.472387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.598 [2024-07-23 05:08:11.472421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.598 #23 NEW cov: 11756 ft: 13954 corp: 13/197b lim: 25 exec/s: 23 rss: 70Mb L: 7/24 MS: 1 EraseBytes- 00:08:40.598 [2024-07-23 05:08:11.512607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.598 [2024-07-23 05:08:11.512641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.598 [2024-07-23 05:08:11.512677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.598 [2024-07-23 05:08:11.512698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.598 #24 NEW cov: 11756 ft: 14165 corp: 14/211b lim: 25 exec/s: 24 rss: 70Mb L: 14/24 MS: 1 PersAutoDict- DE: "\000\000\000\000\002 \217S"- 00:08:40.598 [2024-07-23 05:08:11.562835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.598 [2024-07-23 05:08:11.562868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.598 [2024-07-23 05:08:11.562916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.598 [2024-07-23 05:08:11.562937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.598 [2024-07-23 05:08:11.563002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.598 [2024-07-23 05:08:11.563023] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.598 #29 NEW cov: 11756 ft: 14179 corp: 15/229b lim: 25 exec/s: 29 rss: 70Mb L: 18/24 MS: 5 EraseBytes-EraseBytes-EraseBytes-CopyPart-InsertRepeatedBytes- 00:08:40.598 [2024-07-23 05:08:11.613008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.598 [2024-07-23 05:08:11.613042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.598 [2024-07-23 05:08:11.613091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.598 [2024-07-23 05:08:11.613112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.598 [2024-07-23 05:08:11.613177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.598 [2024-07-23 05:08:11.613197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.598 #30 NEW cov: 11756 ft: 14201 corp: 16/248b lim: 25 exec/s: 30 rss: 70Mb L: 19/24 MS: 1 InsertByte- 00:08:40.598 [2024-07-23 05:08:11.663035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.598 [2024-07-23 05:08:11.663068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.598 [2024-07-23 05:08:11.663103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.598 [2024-07-23 05:08:11.663125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.857 #31 NEW cov: 11756 ft: 14236 corp: 17/258b lim: 25 exec/s: 31 rss: 70Mb L: 10/24 MS: 1 InsertRepeatedBytes- 00:08:40.857 [2024-07-23 05:08:11.703273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.857 [2024-07-23 05:08:11.703306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.857 [2024-07-23 05:08:11.703349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.858 [2024-07-23 05:08:11.703369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.858 [2024-07-23 05:08:11.703436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.858 [2024-07-23 05:08:11.703462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.858 #32 NEW cov: 11756 ft: 14274 corp: 18/273b lim: 25 exec/s: 32 rss: 70Mb L: 15/24 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:40.858 [2024-07-23 05:08:11.753203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.858 [2024-07-23 05:08:11.753236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.858 #33 NEW cov: 11756 ft: 
14287 corp: 19/279b lim: 25 exec/s: 33 rss: 70Mb L: 6/24 MS: 1 ChangeByte- 00:08:40.858 [2024-07-23 05:08:11.793231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.858 [2024-07-23 05:08:11.793264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.858 #34 NEW cov: 11756 ft: 14375 corp: 20/288b lim: 25 exec/s: 34 rss: 70Mb L: 9/24 MS: 1 ChangeByte- 00:08:40.858 [2024-07-23 05:08:11.833525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.858 [2024-07-23 05:08:11.833559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.858 [2024-07-23 05:08:11.833601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.858 [2024-07-23 05:08:11.833621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.858 #35 NEW cov: 11756 ft: 14396 corp: 21/298b lim: 25 exec/s: 35 rss: 70Mb L: 10/24 MS: 1 ShuffleBytes- 00:08:40.858 [2024-07-23 05:08:11.883992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.858 [2024-07-23 05:08:11.884025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.858 [2024-07-23 05:08:11.884084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:40.858 [2024-07-23 05:08:11.884105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.858 [2024-07-23 05:08:11.884170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:40.858 [2024-07-23 05:08:11.884191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.858 [2024-07-23 05:08:11.884256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:40.858 [2024-07-23 05:08:11.884277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.858 #36 NEW cov: 11756 ft: 14409 corp: 22/320b lim: 25 exec/s: 36 rss: 70Mb L: 22/24 MS: 1 EraseBytes- 00:08:40.858 [2024-07-23 05:08:11.923686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:40.858 [2024-07-23 05:08:11.923719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.117 #37 NEW cov: 11756 ft: 14438 corp: 23/329b lim: 25 exec/s: 37 rss: 70Mb L: 9/24 MS: 1 ChangeBit- 00:08:41.117 [2024-07-23 05:08:11.964224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.117 [2024-07-23 05:08:11.964258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.117 [2024-07-23 05:08:11.964312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 
00:08:41.117 [2024-07-23 05:08:11.964339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.117 [2024-07-23 05:08:11.964403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:41.117 [2024-07-23 05:08:11.964423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.117 [2024-07-23 05:08:11.964490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:41.117 [2024-07-23 05:08:11.964512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.117 #38 NEW cov: 11756 ft: 14456 corp: 24/352b lim: 25 exec/s: 38 rss: 70Mb L: 23/24 MS: 1 CMP- DE: "\351\276\215\027`\3430\000"- 00:08:41.117 [2024-07-23 05:08:12.024131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.117 [2024-07-23 05:08:12.024165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.117 [2024-07-23 05:08:12.024218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.117 [2024-07-23 05:08:12.024238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.117 #39 NEW cov: 11756 ft: 14494 corp: 25/362b lim: 25 exec/s: 39 rss: 70Mb L: 10/24 MS: 1 PersAutoDict- DE: "\000\000\000\000\002 \217S"- 00:08:41.117 [2024-07-23 05:08:12.074419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.117 [2024-07-23 05:08:12.074456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.117 [2024-07-23 05:08:12.074499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.117 [2024-07-23 05:08:12.074520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.117 [2024-07-23 05:08:12.074585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:41.117 [2024-07-23 05:08:12.074606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.117 #40 NEW cov: 11756 ft: 14547 corp: 26/378b lim: 25 exec/s: 40 rss: 70Mb L: 16/24 MS: 1 CrossOver- 00:08:41.117 [2024-07-23 05:08:12.124252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.117 [2024-07-23 05:08:12.124285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.117 #41 NEW cov: 11756 ft: 14584 corp: 27/383b lim: 25 exec/s: 41 rss: 70Mb L: 5/24 MS: 1 EraseBytes- 00:08:41.117 [2024-07-23 05:08:12.174647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.117 [2024-07-23 05:08:12.174680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:41.117 [2024-07-23 05:08:12.174723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.117 [2024-07-23 05:08:12.174743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.117 [2024-07-23 05:08:12.174807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:41.117 [2024-07-23 05:08:12.174828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.377 #42 NEW cov: 11756 ft: 14605 corp: 28/401b lim: 25 exec/s: 42 rss: 70Mb L: 18/24 MS: 1 InsertRepeatedBytes- 00:08:41.377 [2024-07-23 05:08:12.224840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.377 [2024-07-23 05:08:12.224880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.377 [2024-07-23 05:08:12.224926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.377 [2024-07-23 05:08:12.224945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.377 [2024-07-23 05:08:12.225010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:41.377 [2024-07-23 05:08:12.225031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.377 #43 NEW cov: 11756 ft: 14622 corp: 29/419b lim: 25 exec/s: 43 rss: 70Mb L: 18/24 MS: 1 ChangeByte- 00:08:41.377 [2024-07-23 05:08:12.264700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.377 [2024-07-23 05:08:12.264734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.377 #44 NEW cov: 11756 ft: 14633 corp: 30/428b lim: 25 exec/s: 44 rss: 70Mb L: 9/24 MS: 1 ShuffleBytes- 00:08:41.377 [2024-07-23 05:08:12.315129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.377 [2024-07-23 05:08:12.315164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.377 [2024-07-23 05:08:12.315211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:41.377 [2024-07-23 05:08:12.315232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.377 [2024-07-23 05:08:12.315296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:41.377 [2024-07-23 05:08:12.315317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.377 #45 NEW cov: 11756 ft: 14658 corp: 31/444b lim: 25 exec/s: 45 rss: 70Mb L: 16/24 MS: 1 InsertRepeatedBytes- 00:08:41.377 [2024-07-23 05:08:12.365388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:41.377 [2024-07-23 05:08:12.365421] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:41.377 [2024-07-23 05:08:12.365488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:41.377 [2024-07-23 05:08:12.365510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:41.377 [2024-07-23 05:08:12.365571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:41.377 [2024-07-23 05:08:12.365591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:41.377 [2024-07-23 05:08:12.365656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:41.377 [2024-07-23 05:08:12.365677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:41.377 #46 NEW cov: 11756 ft: 14670 corp: 32/467b lim: 25 exec/s: 46 rss: 70Mb L: 23/24 MS: 1 CrossOver-
00:08:41.377 [2024-07-23 05:08:12.425622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:41.377 [2024-07-23 05:08:12.425657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:41.377 [2024-07-23 05:08:12.425705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:41.377 [2024-07-23 05:08:12.425725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:41.377 [2024-07-23 05:08:12.425793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:41.377 [2024-07-23 05:08:12.425814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:41.377 [2024-07-23 05:08:12.425879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:41.377 [2024-07-23 05:08:12.425900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:41.377 #47 NEW cov: 11756 ft: 14711 corp: 33/491b lim: 25 exec/s: 47 rss: 71Mb L: 24/24 MS: 1 ChangeByte-
00:08:41.637 [2024-07-23 05:08:12.485462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:41.637 [2024-07-23 05:08:12.485497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:41.637 [2024-07-23 05:08:12.485541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:41.637 [2024-07-23 05:08:12.485561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:41.637 #48 NEW cov: 11756 ft: 14724 corp: 34/502b lim: 25 exec/s: 24 rss: 71Mb L: 11/24 MS: 1 CopyPart-
00:08:41.637 #48 DONE cov: 11756 ft: 14724 corp: 34/502b lim: 25 exec/s: 24 rss: 71Mb
00:08:41.637 ###### Recommended dictionary.
######
00:08:41.637 "\000\000\000\000\002 \217S" # Uses: 2
00:08:41.637 "\000\000\000\000\000\000\000\000" # Uses: 1
00:08:41.637 "\351\276\215\027`\3430\000" # Uses: 0
00:08:41.637 ###### End of recommended dictionary. ######
00:08:41.637 Done 48 runs in 2 second(s)
00:08:41.637 05:08:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf
00:08:41.637 05:08:12 -- ../common.sh@72 -- # (( i++ ))
00:08:41.637 05:08:12 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:41.637 05:08:12 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
00:08:41.637 05:08:12 -- nvmf/run.sh@23 -- # local fuzzer_type=24
00:08:41.637 05:08:12 -- nvmf/run.sh@24 -- # local timen=1
00:08:41.637 05:08:12 -- nvmf/run.sh@25 -- # local core=0x1
00:08:41.637 05:08:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:41.637 05:08:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
00:08:41.637 05:08:12 -- nvmf/run.sh@29 -- # printf %02d 24
00:08:41.637 05:08:12 -- nvmf/run.sh@29 -- # port=4424
00:08:41.637 05:08:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:41.637 05:08:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
00:08:41.637 05:08:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:41.637 05:08:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock
00:08:41.637 [2024-07-23 05:08:12.691600] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:41.637 [2024-07-23 05:08:12.691670] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3149916 ]
00:08:41.897 EAL: No free 2048 kB hugepages reported on node 1
00:08:42.156 [2024-07-23 05:08:12.915437] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:42.156 [2024-07-23 05:08:12.991130] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:42.156 [2024-07-23 05:08:12.991307] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:42.156 [2024-07-23 05:08:13.052262] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:42.156 [2024-07-23 05:08:13.068630] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
00:08:42.156 INFO: Running with entropic power schedule (0xFF, 100).
00:08:42.156 INFO: Seed: 750294407 00:08:42.156 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:08:42.156 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:08:42.156 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:42.156 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.156 #2 INITED exec/s: 0 rss: 60Mb 00:08:42.156 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:42.156 This may also happen if the target rejected all inputs we tried so far 00:08:42.156 [2024-07-23 05:08:13.124420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.156 [2024-07-23 05:08:13.124465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.156 [2024-07-23 05:08:13.124510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.156 [2024-07-23 05:08:13.124529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.156 [2024-07-23 05:08:13.124595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.156 [2024-07-23 05:08:13.124616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.156 [2024-07-23 05:08:13.124680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:42.156 [2024-07-23 05:08:13.124702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.091 NEW_FUNC[1/672]: 0x4ac600 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:43.091 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.091 #13 NEW cov: 11601 ft: 11602 corp: 2/89b lim: 100 exec/s: 0 rss: 67Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:08:43.091 [2024-07-23 05:08:14.086911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.091 [2024-07-23 05:08:14.086955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.091 [2024-07-23 05:08:14.086992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.091 [2024-07-23 05:08:14.087014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.091 [2024-07-23 05:08:14.087077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.091 [2024-07-23 05:08:14.087095] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.091 [2024-07-23 05:08:14.087161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.091 [2024-07-23 05:08:14.087181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.091 #15 NEW cov: 11714 ft: 12135 corp: 3/177b lim: 100 exec/s: 15 rss: 67Mb L: 88/88 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:43.091 [2024-07-23 05:08:14.136923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.091 [2024-07-23 05:08:14.136961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.091 [2024-07-23 05:08:14.137003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.091 [2024-07-23 05:08:14.137022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.091 [2024-07-23 05:08:14.137085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.091 [2024-07-23 05:08:14.137107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.091 [2024-07-23 05:08:14.137171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.091 [2024-07-23 05:08:14.137192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.091 #16 NEW cov: 11720 ft: 12445 corp: 4/265b lim: 100 exec/s: 16 rss: 67Mb L: 88/88 MS: 1 ShuffleBytes- 00:08:43.350 [2024-07-23 05:08:14.187070] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.350 [2024-07-23 05:08:14.187106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.350 [2024-07-23 05:08:14.187164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.350 [2024-07-23 05:08:14.187187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.350 [2024-07-23 05:08:14.187251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.350 [2024-07-23 05:08:14.187273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.350 [2024-07-23 05:08:14.187335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744070559908932 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.350 [2024-07-23 
05:08:14.187356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.350 #22 NEW cov: 11805 ft: 12674 corp: 5/364b lim: 100 exec/s: 22 rss: 67Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:43.350 [2024-07-23 05:08:14.247218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.350 [2024-07-23 05:08:14.247254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.350 [2024-07-23 05:08:14.247302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446504380174696447 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.350 [2024-07-23 05:08:14.247323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.350 [2024-07-23 05:08:14.247389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.350 [2024-07-23 05:08:14.247411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.350 [2024-07-23 05:08:14.247481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.350 [2024-07-23 05:08:14.247511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.350 #23 NEW cov: 11805 ft: 12786 corp: 6/452b lim: 100 exec/s: 23 rss: 67Mb L: 88/99 MS: 1 ChangeByte- 00:08:43.350 [2024-07-23 05:08:14.287388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.287424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.351 [2024-07-23 05:08:14.287487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.287509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.351 [2024-07-23 05:08:14.287576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.287597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.351 [2024-07-23 05:08:14.287660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18391293503297552383 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.287681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.351 #24 NEW cov: 11805 ft: 12864 corp: 7/541b lim: 100 exec/s: 24 rss: 67Mb L: 89/99 MS: 1 InsertByte- 00:08:43.351 [2024-07-23 05:08:14.327522] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18377782704415440895 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.327558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.351 [2024-07-23 05:08:14.327606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.327626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.351 [2024-07-23 05:08:14.327690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.327709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.351 [2024-07-23 05:08:14.327773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744070559908932 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.327793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.351 #25 NEW cov: 11805 ft: 12906 corp: 8/640b lim: 100 exec/s: 25 rss: 67Mb L: 99/99 MS: 1 CrossOver- 00:08:43.351 [2024-07-23 05:08:14.387715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.387750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.351 [2024-07-23 05:08:14.387806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.387827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.351 [2024-07-23 05:08:14.387889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.387914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.351 [2024-07-23 05:08:14.387981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.388002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.351 #26 NEW cov: 11805 ft: 12933 corp: 9/733b lim: 100 exec/s: 26 rss: 67Mb L: 93/99 MS: 1 InsertRepeatedBytes- 00:08:43.351 [2024-07-23 05:08:14.427296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3327647950551526958 len:11823 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.351 [2024-07-23 05:08:14.427328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:43.611 #30 NEW cov: 11805 ft: 13845 corp: 10/771b lim: 100 exec/s: 30 rss: 67Mb L: 38/99 MS: 4 ShuffleBytes-InsertByte-CopyPart-InsertRepeatedBytes- 00:08:43.611 [2024-07-23 05:08:14.477923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.477958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.478018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446504380174696447 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.478039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.478101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.478122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.478186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.478207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.611 #31 NEW cov: 11805 ft: 13881 corp: 11/860b lim: 100 exec/s: 31 rss: 68Mb L: 89/99 MS: 1 InsertByte- 00:08:43.611 [2024-07-23 05:08:14.528084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.528119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.528179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.528200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.528265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.528286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.528350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.528369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.611 #32 NEW cov: 11805 ft: 13927 corp: 12/954b lim: 100 exec/s: 32 rss: 68Mb L: 94/99 MS: 1 InsertByte- 00:08:43.611 [2024-07-23 05:08:14.578226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:18446744073709488895 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.578260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.578322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.578342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.578404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.578425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.578494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744070559908932 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.578521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.611 #33 NEW cov: 11805 ft: 14023 corp: 13/1053b lim: 100 exec/s: 33 rss: 68Mb L: 99/99 MS: 1 ShuffleBytes- 00:08:43.611 [2024-07-23 05:08:14.627977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.628012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.628051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744070559909119 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.628071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.611 #34 NEW cov: 11805 ft: 14393 corp: 14/1111b lim: 100 exec/s: 34 rss: 68Mb L: 58/99 MS: 1 EraseBytes- 00:08:43.611 [2024-07-23 05:08:14.678508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.678541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.678601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446504380174696447 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.678622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.678685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.678706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.611 [2024-07-23 05:08:14.678771] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.611 [2024-07-23 05:08:14.678792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.871 #35 NEW cov: 11805 ft: 14424 corp: 15/1208b lim: 100 exec/s: 35 rss: 68Mb L: 97/99 MS: 1 InsertRepeatedBytes- 00:08:43.871 [2024-07-23 05:08:14.728693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.871 [2024-07-23 05:08:14.728730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.871 [2024-07-23 05:08:14.728774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.871 [2024-07-23 05:08:14.728794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.871 [2024-07-23 05:08:14.728856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.728878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.728943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744070559908932 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.728962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.872 #36 NEW cov: 11805 ft: 14522 corp: 16/1307b lim: 100 exec/s: 36 rss: 68Mb L: 99/99 MS: 1 ShuffleBytes- 00:08:43.872 [2024-07-23 05:08:14.768758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709488895 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.768793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.768848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.768869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.768931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.768952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.769018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744070559908932 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.769039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 
m:0 dnr:1 00:08:43.872 #37 NEW cov: 11805 ft: 14543 corp: 17/1406b lim: 100 exec/s: 37 rss: 68Mb L: 99/99 MS: 1 ChangeBinInt- 00:08:43.872 [2024-07-23 05:08:14.818945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.818979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.819027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.819048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.819113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.819133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.819198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.819219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.872 #38 NEW cov: 11805 ft: 14557 corp: 18/1499b lim: 100 exec/s: 38 rss: 68Mb L: 93/99 MS: 1 ChangeBinInt- 00:08:43.872 [2024-07-23 05:08:14.859202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709488895 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.859237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.859300] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.859322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.859384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.859405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.859469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4971973985467384900 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.859490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.859553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.859573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:43.872 #39 NEW cov: 11805 ft: 14621 corp: 19/1599b lim: 100 exec/s: 39 rss: 68Mb L: 100/100 MS: 1 CopyPart- 00:08:43.872 [2024-07-23 05:08:14.899155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.899189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.899249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.899270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.899334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.899355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.899421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744070559908932 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.899453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.872 #40 NEW cov: 11805 ft: 14705 corp: 20/1698b lim: 100 exec/s: 40 rss: 68Mb L: 99/100 MS: 1 ShuffleBytes- 00:08:43.872 [2024-07-23 05:08:14.949305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.949339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.949398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.949421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.949487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.949506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.872 [2024-07-23 05:08:14.949569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744070559908932 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.872 [2024-07-23 05:08:14.949590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.132 #41 NEW cov: 11805 ft: 14712 corp: 21/1797b lim: 100 exec/s: 41 rss: 69Mb L: 99/100 MS: 1 ChangeByte- 00:08:44.132 [2024-07-23 05:08:14.999131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:14.999165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.132 [2024-07-23 05:08:14.999214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744070559909119 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:14.999236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.132 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:44.132 #42 NEW cov: 11828 ft: 14741 corp: 22/1855b lim: 100 exec/s: 42 rss: 69Mb L: 58/100 MS: 1 ChangeByte- 00:08:44.132 [2024-07-23 05:08:15.059648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709488895 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:15.059682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.132 [2024-07-23 05:08:15.059743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:15.059763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.132 [2024-07-23 05:08:15.059826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:15.059847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.132 [2024-07-23 05:08:15.059911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744070559908932 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:15.059932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.132 #43 NEW cov: 11828 ft: 14761 corp: 23/1954b lim: 100 exec/s: 43 rss: 69Mb L: 99/100 MS: 1 ChangeByte- 00:08:44.132 [2024-07-23 05:08:15.099727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17726168133330272255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:15.099762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.132 [2024-07-23 05:08:15.099820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446504380174696447 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:15.099841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.132 [2024-07-23 05:08:15.099909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:15.099930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.132 [2024-07-23 05:08:15.099995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.132 [2024-07-23 05:08:15.100015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.132 #44 NEW cov: 11828 ft: 14773 corp: 24/2043b lim: 100 exec/s: 22 rss: 69Mb L: 89/100 MS: 1 ChangeBinInt- 00:08:44.132 #44 DONE cov: 11828 ft: 14773 corp: 24/2043b lim: 100 exec/s: 22 rss: 69Mb 00:08:44.132 Done 44 runs in 2 second(s) 00:08:44.393 05:08:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:44.393 05:08:15 -- ../common.sh@72 -- # (( i++ )) 00:08:44.393 05:08:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.393 05:08:15 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:44.393 00:08:44.393 real 1m5.728s 00:08:44.393 user 1m36.438s 00:08:44.393 sys 0m7.896s 00:08:44.393 05:08:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.393 05:08:15 -- common/autotest_common.sh@10 -- # set +x 00:08:44.393 ************************************ 00:08:44.393 END TEST nvmf_fuzz 00:08:44.393 ************************************ 00:08:44.393 05:08:15 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:44.393 05:08:15 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:44.393 05:08:15 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:44.393 05:08:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:44.393 05:08:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:44.393 05:08:15 -- common/autotest_common.sh@10 -- # set +x 00:08:44.393 ************************************ 00:08:44.393 START TEST vfio_fuzz 00:08:44.393 ************************************ 00:08:44.393 05:08:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:44.393 * Looking for test storage... 
00:08:44.393 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.393 05:08:15 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:44.393 05:08:15 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:44.393 05:08:15 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:44.393 05:08:15 -- common/autotest_common.sh@34 -- # set -e 00:08:44.393 05:08:15 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:44.393 05:08:15 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:44.393 05:08:15 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:44.393 05:08:15 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:44.393 05:08:15 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:44.393 05:08:15 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:44.393 05:08:15 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:44.393 05:08:15 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:44.393 05:08:15 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:44.393 05:08:15 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:44.393 05:08:15 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:44.393 05:08:15 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:44.393 05:08:15 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:44.393 05:08:15 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:44.393 05:08:15 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:44.393 05:08:15 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:44.393 05:08:15 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:44.393 05:08:15 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:44.393 05:08:15 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:44.393 05:08:15 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:44.393 05:08:15 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:44.393 05:08:15 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:44.393 05:08:15 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:44.393 05:08:15 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:44.393 05:08:15 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:44.393 05:08:15 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:44.393 05:08:15 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:44.393 05:08:15 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:44.393 05:08:15 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:44.393 05:08:15 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:44.393 05:08:15 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:44.393 05:08:15 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:44.393 05:08:15 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:44.393 05:08:15 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:44.393 05:08:15 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:44.393 05:08:15 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:44.393 05:08:15 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:44.393 05:08:15 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:44.393 05:08:15 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:44.393 05:08:15 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:44.393 05:08:15 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:44.393 05:08:15 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:44.393 05:08:15 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:44.393 05:08:15 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:44.393 05:08:15 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:08:44.393 05:08:15 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:44.393 05:08:15 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:44.393 05:08:15 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:44.393 05:08:15 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:44.393 05:08:15 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:44.393 05:08:15 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:44.393 05:08:15 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:44.393 05:08:15 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:44.393 05:08:15 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:44.393 05:08:15 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:44.393 05:08:15 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:44.393 05:08:15 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:44.393 05:08:15 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:44.393 05:08:15 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:44.393 05:08:15 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:44.393 05:08:15 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:44.393 05:08:15 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:44.393 05:08:15 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:44.393 05:08:15 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:44.393 05:08:15 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:08:44.393 05:08:15 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:44.393 05:08:15 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:44.393 05:08:15 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:44.393 05:08:15 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:44.393 05:08:15 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:44.393 05:08:15 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:44.393 05:08:15 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:44.393 05:08:15 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:44.393 05:08:15 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:44.393 05:08:15 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:44.393 05:08:15 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:44.393 05:08:15 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:44.393 05:08:15 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:44.393 05:08:15 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:44.393 05:08:15 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:44.393 05:08:15 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:44.393 05:08:15 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:44.393 05:08:15 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:44.393 05:08:15 -- common/autotest_common.sh@48 -- # source 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:44.393 05:08:15 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:44.394 05:08:15 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:44.394 05:08:15 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:44.394 05:08:15 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:44.394 05:08:15 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:44.394 05:08:15 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:44.394 05:08:15 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:44.394 05:08:15 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:44.394 05:08:15 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:44.394 05:08:15 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:44.394 05:08:15 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:44.394 05:08:15 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:44.394 05:08:15 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:44.394 05:08:15 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:44.394 05:08:15 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:44.394 #define SPDK_CONFIG_H 00:08:44.394 #define SPDK_CONFIG_APPS 1 00:08:44.394 #define SPDK_CONFIG_ARCH native 00:08:44.394 #undef SPDK_CONFIG_ASAN 00:08:44.394 #undef SPDK_CONFIG_AVAHI 00:08:44.394 #undef SPDK_CONFIG_CET 00:08:44.394 #define SPDK_CONFIG_COVERAGE 1 00:08:44.394 #define SPDK_CONFIG_CROSS_PREFIX 00:08:44.394 #undef SPDK_CONFIG_CRYPTO 00:08:44.394 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:44.394 #undef SPDK_CONFIG_CUSTOMOCF 00:08:44.394 #undef SPDK_CONFIG_DAOS 00:08:44.394 #define SPDK_CONFIG_DAOS_DIR 00:08:44.394 #define SPDK_CONFIG_DEBUG 1 00:08:44.394 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:44.394 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:44.394 #define SPDK_CONFIG_DPDK_INC_DIR 00:08:44.394 #define SPDK_CONFIG_DPDK_LIB_DIR 00:08:44.394 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:44.394 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:44.394 #define SPDK_CONFIG_EXAMPLES 1 00:08:44.394 #undef SPDK_CONFIG_FC 00:08:44.394 #define SPDK_CONFIG_FC_PATH 00:08:44.394 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:44.394 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:44.394 #undef SPDK_CONFIG_FUSE 00:08:44.394 #define SPDK_CONFIG_FUZZER 1 00:08:44.394 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:44.394 #undef SPDK_CONFIG_GOLANG 00:08:44.394 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:44.394 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:44.394 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:44.394 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:44.394 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:44.394 #define SPDK_CONFIG_IDXD 1 00:08:44.394 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:44.394 #undef SPDK_CONFIG_IPSEC_MB 
00:08:44.394 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:44.394 #define SPDK_CONFIG_ISAL 1 00:08:44.394 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:44.394 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:44.394 #define SPDK_CONFIG_LIBDIR 00:08:44.394 #undef SPDK_CONFIG_LTO 00:08:44.394 #define SPDK_CONFIG_MAX_LCORES 00:08:44.394 #define SPDK_CONFIG_NVME_CUSE 1 00:08:44.394 #undef SPDK_CONFIG_OCF 00:08:44.394 #define SPDK_CONFIG_OCF_PATH 00:08:44.394 #define SPDK_CONFIG_OPENSSL_PATH 00:08:44.394 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:44.394 #undef SPDK_CONFIG_PGO_USE 00:08:44.394 #define SPDK_CONFIG_PREFIX /usr/local 00:08:44.394 #undef SPDK_CONFIG_RAID5F 00:08:44.394 #undef SPDK_CONFIG_RBD 00:08:44.394 #define SPDK_CONFIG_RDMA 1 00:08:44.394 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:44.394 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:44.394 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:44.394 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:44.394 #undef SPDK_CONFIG_SHARED 00:08:44.394 #undef SPDK_CONFIG_SMA 00:08:44.394 #define SPDK_CONFIG_TESTS 1 00:08:44.394 #undef SPDK_CONFIG_TSAN 00:08:44.394 #define SPDK_CONFIG_UBLK 1 00:08:44.394 #define SPDK_CONFIG_UBSAN 1 00:08:44.394 #undef SPDK_CONFIG_UNIT_TESTS 00:08:44.394 #undef SPDK_CONFIG_URING 00:08:44.394 #define SPDK_CONFIG_URING_PATH 00:08:44.394 #undef SPDK_CONFIG_URING_ZNS 00:08:44.394 #undef SPDK_CONFIG_USDT 00:08:44.394 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:44.394 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:44.394 #define SPDK_CONFIG_VFIO_USER 1 00:08:44.394 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:44.394 #define SPDK_CONFIG_VHOST 1 00:08:44.394 #define SPDK_CONFIG_VIRTIO 1 00:08:44.394 #undef SPDK_CONFIG_VTUNE 00:08:44.394 #define SPDK_CONFIG_VTUNE_DIR 00:08:44.394 #define SPDK_CONFIG_WERROR 1 00:08:44.394 #define SPDK_CONFIG_WPDK_DIR 00:08:44.394 #undef SPDK_CONFIG_XNVME 00:08:44.394 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:44.394 05:08:15 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:44.394 05:08:15 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:44.394 05:08:15 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:44.394 05:08:15 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:44.394 05:08:15 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:44.394 05:08:15 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:44.394 05:08:15 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:44.394 05:08:15 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:44.394 05:08:15 -- paths/export.sh@5 -- # export PATH 00:08:44.394 05:08:15 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:44.394 05:08:15 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:44.394 05:08:15 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:44.394 05:08:15 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:44.394 05:08:15 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:44.394 05:08:15 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:44.394 05:08:15 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:44.394 05:08:15 -- pm/common@16 -- # TEST_TAG=N/A 00:08:44.394 05:08:15 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:44.394 05:08:15 -- common/autotest_common.sh@52 -- # : 1 00:08:44.394 05:08:15 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:44.394 05:08:15 -- common/autotest_common.sh@56 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:44.394 05:08:15 -- common/autotest_common.sh@58 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:44.394 05:08:15 -- common/autotest_common.sh@60 -- # : 1 00:08:44.394 05:08:15 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:44.394 05:08:15 -- common/autotest_common.sh@62 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:44.394 05:08:15 -- common/autotest_common.sh@64 -- # : 00:08:44.394 05:08:15 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:44.394 05:08:15 -- common/autotest_common.sh@66 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:44.394 05:08:15 -- common/autotest_common.sh@68 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:44.394 05:08:15 -- common/autotest_common.sh@70 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:44.394 05:08:15 -- common/autotest_common.sh@72 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:44.394 05:08:15 -- common/autotest_common.sh@74 -- # : 0 00:08:44.394 
05:08:15 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:44.394 05:08:15 -- common/autotest_common.sh@76 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:44.394 05:08:15 -- common/autotest_common.sh@78 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:44.394 05:08:15 -- common/autotest_common.sh@80 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:44.394 05:08:15 -- common/autotest_common.sh@82 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:44.394 05:08:15 -- common/autotest_common.sh@84 -- # : 0 00:08:44.394 05:08:15 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:44.394 05:08:15 -- common/autotest_common.sh@86 -- # : 0 00:08:44.395 05:08:15 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:44.395 05:08:15 -- common/autotest_common.sh@88 -- # : 0 00:08:44.395 05:08:15 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:44.395 05:08:15 -- common/autotest_common.sh@90 -- # : 0 00:08:44.395 05:08:15 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:44.395 05:08:15 -- common/autotest_common.sh@92 -- # : 1 00:08:44.395 05:08:15 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:44.395 05:08:15 -- common/autotest_common.sh@94 -- # : 1 00:08:44.395 05:08:15 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:44.395 05:08:15 -- common/autotest_common.sh@96 -- # : rdma 00:08:44.395 05:08:15 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:44.395 05:08:15 -- common/autotest_common.sh@98 -- # : 0 00:08:44.395 05:08:15 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:44.395 05:08:15 -- common/autotest_common.sh@100 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:44.655 05:08:15 -- common/autotest_common.sh@102 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:44.655 05:08:15 -- common/autotest_common.sh@104 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:44.655 05:08:15 -- common/autotest_common.sh@106 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:44.655 05:08:15 -- common/autotest_common.sh@108 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:44.655 05:08:15 -- common/autotest_common.sh@110 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:44.655 05:08:15 -- common/autotest_common.sh@112 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:44.655 05:08:15 -- common/autotest_common.sh@114 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:44.655 05:08:15 -- common/autotest_common.sh@116 -- # : 1 00:08:44.655 05:08:15 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:44.655 05:08:15 -- common/autotest_common.sh@118 -- # : 00:08:44.655 05:08:15 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:44.655 05:08:15 -- common/autotest_common.sh@120 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:44.655 05:08:15 -- common/autotest_common.sh@122 -- # : 0 
00:08:44.655 05:08:15 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:44.655 05:08:15 -- common/autotest_common.sh@124 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:44.655 05:08:15 -- common/autotest_common.sh@126 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:44.655 05:08:15 -- common/autotest_common.sh@128 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:44.655 05:08:15 -- common/autotest_common.sh@130 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:44.655 05:08:15 -- common/autotest_common.sh@132 -- # : 00:08:44.655 05:08:15 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:44.655 05:08:15 -- common/autotest_common.sh@134 -- # : true 00:08:44.655 05:08:15 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:44.655 05:08:15 -- common/autotest_common.sh@136 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:44.655 05:08:15 -- common/autotest_common.sh@138 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:44.655 05:08:15 -- common/autotest_common.sh@140 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:44.655 05:08:15 -- common/autotest_common.sh@142 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:44.655 05:08:15 -- common/autotest_common.sh@144 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:44.655 05:08:15 -- common/autotest_common.sh@146 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:44.655 05:08:15 -- common/autotest_common.sh@148 -- # : 00:08:44.655 05:08:15 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:44.655 05:08:15 -- common/autotest_common.sh@150 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:44.655 05:08:15 -- common/autotest_common.sh@152 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:44.655 05:08:15 -- common/autotest_common.sh@154 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:44.655 05:08:15 -- common/autotest_common.sh@156 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:44.655 05:08:15 -- common/autotest_common.sh@158 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:44.655 05:08:15 -- common/autotest_common.sh@160 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:44.655 05:08:15 -- common/autotest_common.sh@163 -- # : 00:08:44.655 05:08:15 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:44.655 05:08:15 -- common/autotest_common.sh@165 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:44.655 05:08:15 -- common/autotest_common.sh@167 -- # : 0 00:08:44.655 05:08:15 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:44.655 05:08:15 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:44.656 05:08:15 -- 
common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:44.656 05:08:15 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:44.656 05:08:15 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:44.656 05:08:15 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:44.656 05:08:15 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:44.656 05:08:15 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:44.656 05:08:15 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:44.656 05:08:15 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:44.656 05:08:15 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:44.656 05:08:15 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:44.656 05:08:15 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:44.656 05:08:15 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:44.656 05:08:15 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:44.656 05:08:15 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:44.656 05:08:15 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:44.656 05:08:15 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:44.656 05:08:15 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:44.656 05:08:15 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:44.656 05:08:15 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:44.656 05:08:15 -- common/autotest_common.sh@196 -- # cat 00:08:44.656 05:08:15 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:44.656 05:08:15 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:44.656 05:08:15 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:44.656 05:08:15 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:44.656 05:08:15 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:44.656 05:08:15 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:44.656 05:08:15 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:44.656 05:08:15 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:44.656 05:08:15 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:44.656 05:08:15 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:44.656 05:08:15 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:44.656 05:08:15 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:44.656 05:08:15 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:44.656 05:08:15 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:44.656 05:08:15 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:44.656 05:08:15 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:44.656 05:08:15 -- common/autotest_common.sh@242 -- # 
AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:44.656 05:08:15 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:44.656 05:08:15 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:44.656 05:08:15 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:44.656 05:08:15 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:44.656 05:08:15 -- common/autotest_common.sh@249 -- # valgrind= 00:08:44.656 05:08:15 -- common/autotest_common.sh@255 -- # uname -s 00:08:44.656 05:08:15 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:44.656 05:08:15 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:44.656 05:08:15 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:44.656 05:08:15 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:44.656 05:08:15 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:44.656 05:08:15 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:44.656 05:08:15 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:44.656 05:08:15 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:44.656 05:08:15 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:44.656 05:08:15 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:44.656 05:08:15 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:44.656 05:08:15 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:44.656 05:08:15 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:44.656 05:08:15 -- common/autotest_common.sh@309 -- # [[ -z 3150362 ]] 00:08:44.656 05:08:15 -- common/autotest_common.sh@309 -- # kill -0 3150362 00:08:44.656 05:08:15 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:44.656 05:08:15 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:08:44.656 05:08:15 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:44.656 05:08:15 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:44.656 05:08:15 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:44.656 05:08:15 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:44.656 05:08:15 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:44.656 05:08:15 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:44.656 05:08:15 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.w0HxaA 00:08:44.656 05:08:15 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:44.656 05:08:15 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:44.656 05:08:15 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:08:44.656 05:08:15 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.w0HxaA/tests/vfio /tmp/spdk.w0HxaA 00:08:44.656 05:08:15 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:44.656 05:08:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:44.656 05:08:15 -- common/autotest_common.sh@318 -- # df -T 00:08:44.656 05:08:15 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 
00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:44.656 05:08:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:44.656 05:08:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:44.656 05:08:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:08:44.656 05:08:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=49152442368 00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:08:44.656 05:08:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=12589875200 00:08:44.656 05:08:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:44.656 05:08:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:08:44.656 05:08:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:44.656 05:08:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342489088 00:08:44.656 05:08:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:08:44.657 05:08:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=5976064 00:08:44.657 05:08:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:44.657 05:08:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:44.657 05:08:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:44.657 05:08:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868627456 00:08:44.657 05:08:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:44.657 05:08:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=2531328 00:08:44.657 05:08:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:44.657 05:08:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:44.657 05:08:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:44.657 05:08:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:08:44.657 05:08:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:08:44.657 05:08:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:08:44.657 05:08:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:44.657 05:08:15 -- 
common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:08:44.657 * Looking for test storage... 00:08:44.657 05:08:15 -- common/autotest_common.sh@359 -- # local target_space new_size 00:08:44.657 05:08:15 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:08:44.657 05:08:15 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.657 05:08:15 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:44.657 05:08:15 -- common/autotest_common.sh@363 -- # mount=/ 00:08:44.657 05:08:15 -- common/autotest_common.sh@365 -- # target_space=49152442368 00:08:44.657 05:08:15 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:08:44.657 05:08:15 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:08:44.657 05:08:15 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:08:44.657 05:08:15 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:08:44.657 05:08:15 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:08:44.657 05:08:15 -- common/autotest_common.sh@372 -- # new_size=14804467712 00:08:44.657 05:08:15 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:44.657 05:08:15 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.657 05:08:15 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.657 05:08:15 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.657 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:44.657 05:08:15 -- common/autotest_common.sh@380 -- # return 0 00:08:44.657 05:08:15 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:08:44.657 05:08:15 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:08:44.657 05:08:15 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:44.657 05:08:15 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:44.657 05:08:15 -- common/autotest_common.sh@1672 -- # true 00:08:44.657 05:08:15 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:08:44.657 05:08:15 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:44.657 05:08:15 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:44.657 05:08:15 -- common/autotest_common.sh@27 -- # exec 00:08:44.657 05:08:15 -- common/autotest_common.sh@29 -- # exec 00:08:44.657 05:08:15 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:44.657 05:08:15 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:44.657 05:08:15 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:44.657 05:08:15 -- common/autotest_common.sh@18 -- # set -x 00:08:44.657 05:08:15 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:44.657 05:08:15 -- ../common.sh@8 -- # pids=() 00:08:44.657 05:08:15 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:44.657 05:08:15 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:44.657 05:08:15 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:44.657 05:08:15 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:44.657 05:08:15 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:44.657 05:08:15 -- vfio/run.sh@65 -- # mem_size=0 00:08:44.657 05:08:15 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:44.657 05:08:15 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:44.657 05:08:15 -- ../common.sh@69 -- # local fuzz_num=7 00:08:44.657 05:08:15 -- ../common.sh@70 -- # local time=1 00:08:44.657 05:08:15 -- ../common.sh@72 -- # (( i = 0 )) 00:08:44.657 05:08:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.657 05:08:15 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:44.657 05:08:15 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:44.657 05:08:15 -- vfio/run.sh@23 -- # local timen=1 00:08:44.657 05:08:15 -- vfio/run.sh@24 -- # local core=0x1 00:08:44.657 05:08:15 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:44.657 05:08:15 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:44.657 05:08:15 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:44.657 05:08:15 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:44.657 05:08:15 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:44.657 05:08:15 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:44.657 05:08:15 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:44.657 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:44.657 05:08:15 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:44.657 [2024-07-23 05:08:15.619251] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:44.657 [2024-07-23 05:08:15.619332] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3150436 ] 00:08:44.657 EAL: No free 2048 kB hugepages reported on node 1 00:08:44.657 [2024-07-23 05:08:15.724559] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.917 [2024-07-23 05:08:15.808134] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:44.917 [2024-07-23 05:08:15.808322] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.917 INFO: Running with entropic power schedule (0xFF, 100). 00:08:44.917 INFO: Seed: 3672304207 00:08:45.176 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:45.176 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:45.176 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:45.176 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.176 #2 INITED exec/s: 0 rss: 61Mb 00:08:45.176 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:45.176 This may also happen if the target rejected all inputs we tried so far 00:08:45.744 NEW_FUNC[1/631]: 0x4806f0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:45.744 NEW_FUNC[2/631]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:45.744 #12 NEW cov: 10706 ft: 10465 corp: 2/7b lim: 60 exec/s: 0 rss: 66Mb L: 6/6 MS: 5 InsertByte-ShuffleBytes-CrossOver-CopyPart-InsertByte- 00:08:46.003 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:46.003 #13 NEW cov: 10747 ft: 13362 corp: 3/17b lim: 60 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:08:46.262 #16 NEW cov: 10747 ft: 15493 corp: 4/74b lim: 60 exec/s: 16 rss: 69Mb L: 57/57 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:46.262 #17 NEW cov: 10747 ft: 15880 corp: 5/80b lim: 60 exec/s: 17 rss: 69Mb L: 6/57 MS: 1 ChangeBit- 00:08:46.521 #18 NEW cov: 10747 ft: 16353 corp: 6/90b lim: 60 exec/s: 18 rss: 69Mb L: 10/57 MS: 1 ChangeBinInt- 00:08:46.780 #19 NEW cov: 10747 ft: 16536 corp: 7/139b lim: 60 exec/s: 19 rss: 69Mb L: 49/57 MS: 1 InsertRepeatedBytes- 00:08:47.039 #20 NEW cov: 10754 ft: 16880 corp: 8/188b lim: 60 exec/s: 20 rss: 69Mb L: 49/57 MS: 1 ChangeBit- 00:08:47.298 #21 NEW cov: 10754 ft: 16942 corp: 9/217b lim: 60 exec/s: 10 rss: 69Mb L: 29/57 MS: 1 EraseBytes- 00:08:47.298 #21 DONE cov: 10754 ft: 16942 corp: 9/217b lim: 60 exec/s: 10 rss: 69Mb 00:08:47.298 Done 21 runs in 2 second(s) 00:08:47.557 05:08:18 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:47.557 05:08:18 -- ../common.sh@72 -- # (( i++ )) 00:08:47.557 05:08:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.557 05:08:18 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:47.557 05:08:18 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:47.557 05:08:18 -- vfio/run.sh@23 -- # local timen=1 00:08:47.557 05:08:18 -- vfio/run.sh@24 -- # local core=0x1 00:08:47.557 05:08:18 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:47.557 05:08:18 -- vfio/run.sh@26 -- 
# local fuzzer_dir=/tmp/vfio-user-1 00:08:47.557 05:08:18 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:47.557 05:08:18 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:47.557 05:08:18 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:47.557 05:08:18 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:47.557 05:08:18 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:47.557 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:47.557 05:08:18 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:47.557 [2024-07-23 05:08:18.522416] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:47.557 [2024-07-23 05:08:18.522502] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3150948 ] 00:08:47.557 EAL: No free 2048 kB hugepages reported on node 1 00:08:47.557 [2024-07-23 05:08:18.628433] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.817 [2024-07-23 05:08:18.711273] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:47.817 [2024-07-23 05:08:18.711463] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.817 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.817 INFO: Seed: 2280328315 00:08:48.076 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:48.076 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:48.076 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:48.076 INFO: A corpus is not provided, starting from an empty corpus 00:08:48.076 #2 INITED exec/s: 0 rss: 61Mb 00:08:48.076 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:48.076 This may also happen if the target rejected all inputs we tried so far 00:08:48.076 [2024-07-23 05:08:19.017509] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:48.076 [2024-07-23 05:08:19.017544] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:48.076 [2024-07-23 05:08:19.017630] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:49.272 NEW_FUNC[1/638]: 0x480c90 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:49.272 NEW_FUNC[2/638]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:49.272 #26 NEW cov: 10730 ft: 10693 corp: 2/8b lim: 40 exec/s: 26 rss: 66Mb L: 7/7 MS: 4 CopyPart-ChangeBinInt-InsertByte-CMP- DE: "\377\377\377\377"- 00:08:49.272 [2024-07-23 05:08:20.174111] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:49.272 [2024-07-23 05:08:20.174160] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:49.272 [2024-07-23 05:08:20.174185] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:49.272 #27 NEW cov: 10745 ft: 14294 corp: 3/13b lim: 40 exec/s: 27 rss: 68Mb L: 5/7 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:49.530 [2024-07-23 05:08:20.416839] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:49.530 [2024-07-23 05:08:20.416870] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:49.530 [2024-07-23 05:08:20.416896] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:49.530 #28 NEW cov: 10745 ft: 15328 corp: 4/24b lim: 40 exec/s: 28 rss: 69Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:08:49.789 [2024-07-23 05:08:20.637428] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:49.789 [2024-07-23 05:08:20.637470] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:49.789 [2024-07-23 05:08:20.637496] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:49.789 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:49.789 #29 NEW cov: 10768 ft: 15912 corp: 5/31b lim: 40 exec/s: 29 rss: 69Mb L: 7/11 MS: 1 ShuffleBytes- 00:08:49.789 [2024-07-23 05:08:20.860579] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:49.789 [2024-07-23 05:08:20.860607] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:49.789 [2024-07-23 05:08:20.860630] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:50.047 #30 NEW cov: 10768 ft: 16175 corp: 6/37b lim: 40 exec/s: 15 rss: 69Mb L: 6/11 MS: 1 EraseBytes- 00:08:50.047 #30 DONE cov: 10768 ft: 16175 corp: 6/37b lim: 40 exec/s: 15 rss: 69Mb 00:08:50.047 ###### Recommended dictionary. ###### 00:08:50.047 "\377\377\377\377" # Uses: 1 00:08:50.047 ###### End of recommended dictionary. 
###### 00:08:50.047 Done 30 runs in 2 second(s) 00:08:50.306 05:08:21 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:50.306 05:08:21 -- ../common.sh@72 -- # (( i++ )) 00:08:50.306 05:08:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.306 05:08:21 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:50.306 05:08:21 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:50.306 05:08:21 -- vfio/run.sh@23 -- # local timen=1 00:08:50.306 05:08:21 -- vfio/run.sh@24 -- # local core=0x1 00:08:50.306 05:08:21 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:50.306 05:08:21 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:50.306 05:08:21 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:50.306 05:08:21 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:50.306 05:08:21 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:50.306 05:08:21 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:50.306 05:08:21 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:50.306 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:50.306 05:08:21 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:50.306 [2024-07-23 05:08:21.333151] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:50.306 [2024-07-23 05:08:21.333237] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3151486 ] 00:08:50.306 EAL: No free 2048 kB hugepages reported on node 1 00:08:50.565 [2024-07-23 05:08:21.438160] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.565 [2024-07-23 05:08:21.519999] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:50.565 [2024-07-23 05:08:21.520191] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.862 INFO: Running with entropic power schedule (0xFF, 100). 00:08:50.862 INFO: Seed: 789360215 00:08:50.862 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:50.862 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:50.862 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:50.862 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.862 #2 INITED exec/s: 0 rss: 61Mb 00:08:50.862 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:50.862 This may also happen if the target rejected all inputs we tried so far 00:08:50.862 [2024-07-23 05:08:21.834758] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:52.055 NEW_FUNC[1/636]: 0x481670 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:52.055 NEW_FUNC[2/636]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:52.055 #5 NEW cov: 10707 ft: 10649 corp: 2/24b lim: 80 exec/s: 5 rss: 66Mb L: 23/23 MS: 3 CopyPart-CrossOver-InsertRepeatedBytes- 00:08:52.055 [2024-07-23 05:08:22.983308] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:52.055 #6 NEW cov: 10721 ft: 13712 corp: 3/47b lim: 80 exec/s: 6 rss: 68Mb L: 23/23 MS: 1 ChangeByte- 00:08:52.314 [2024-07-23 05:08:23.209133] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:52.314 #7 NEW cov: 10721 ft: 14380 corp: 4/70b lim: 80 exec/s: 7 rss: 69Mb L: 23/23 MS: 1 ChangeBinInt- 00:08:52.573 [2024-07-23 05:08:23.434288] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:52.573 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:52.573 #8 NEW cov: 10744 ft: 14580 corp: 5/98b lim: 80 exec/s: 8 rss: 69Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:52.573 [2024-07-23 05:08:23.659689] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:52.832 #9 NEW cov: 10744 ft: 15151 corp: 6/122b lim: 80 exec/s: 4 rss: 69Mb L: 24/28 MS: 1 InsertByte- 00:08:52.832 #9 DONE cov: 10744 ft: 15151 corp: 6/122b lim: 80 exec/s: 4 rss: 69Mb 00:08:52.832 Done 9 runs in 2 second(s) 00:08:53.092 05:08:24 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:53.092 05:08:24 -- ../common.sh@72 -- # (( i++ )) 00:08:53.092 05:08:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:53.092 05:08:24 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:53.092 05:08:24 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:53.092 05:08:24 -- vfio/run.sh@23 -- # local timen=1 00:08:53.092 05:08:24 -- vfio/run.sh@24 -- # local core=0x1 00:08:53.092 05:08:24 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:53.092 05:08:24 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:53.092 05:08:24 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:53.092 05:08:24 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:53.092 05:08:24 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:53.092 05:08:24 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:53.092 05:08:24 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:53.092 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:53.092 05:08:24 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 
/tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:53.092 [2024-07-23 05:08:24.133531] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:53.092 [2024-07-23 05:08:24.133608] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3152036 ] 00:08:53.092 EAL: No free 2048 kB hugepages reported on node 1 00:08:53.351 [2024-07-23 05:08:24.236317] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.351 [2024-07-23 05:08:24.318750] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:53.351 [2024-07-23 05:08:24.318935] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.611 INFO: Running with entropic power schedule (0xFF, 100). 00:08:53.611 INFO: Seed: 3590368239 00:08:53.611 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:53.611 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:53.611 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:53.611 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.611 #2 INITED exec/s: 0 rss: 61Mb 00:08:53.611 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:53.611 This may also happen if the target rejected all inputs we tried so far 00:08:54.807 NEW_FUNC[1/632]: 0x481d50 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:54.807 NEW_FUNC[2/632]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:54.807 #4 NEW cov: 10700 ft: 10665 corp: 2/40b lim: 320 exec/s: 4 rss: 66Mb L: 39/39 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:54.807 #5 NEW cov: 10717 ft: 14462 corp: 3/96b lim: 320 exec/s: 5 rss: 68Mb L: 56/56 MS: 1 CopyPart- 00:08:55.067 #6 NEW cov: 10717 ft: 14715 corp: 4/152b lim: 320 exec/s: 6 rss: 69Mb L: 56/56 MS: 1 ChangeBinInt- 00:08:55.327 #7 NEW cov: 10717 ft: 15071 corp: 5/209b lim: 320 exec/s: 7 rss: 69Mb L: 57/57 MS: 1 InsertByte- 00:08:55.327 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:55.327 #8 NEW cov: 10740 ft: 15313 corp: 6/265b lim: 320 exec/s: 8 rss: 69Mb L: 56/57 MS: 1 ChangeBit- 00:08:55.586 #9 NEW cov: 10740 ft: 15894 corp: 7/346b lim: 320 exec/s: 4 rss: 69Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:08:55.586 #9 DONE cov: 10740 ft: 15894 corp: 7/346b lim: 320 exec/s: 4 rss: 69Mb 00:08:55.586 Done 9 runs in 2 second(s) 00:08:55.845 05:08:26 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:55.845 05:08:26 -- ../common.sh@72 -- # (( i++ )) 00:08:55.845 05:08:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.845 05:08:26 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:55.845 05:08:26 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:55.845 05:08:26 -- vfio/run.sh@23 -- # local timen=1 00:08:55.845 05:08:26 -- vfio/run.sh@24 -- # local core=0x1 00:08:55.845 05:08:26 -- vfio/run.sh@25 -- # local 
corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:55.845 05:08:26 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:55.845 05:08:26 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:55.845 05:08:26 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:55.845 05:08:26 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:55.845 05:08:26 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:55.845 05:08:26 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:55.845 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:55.845 05:08:26 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:55.845 [2024-07-23 05:08:26.890819] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:55.845 [2024-07-23 05:08:26.890888] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3152573 ] 00:08:55.845 EAL: No free 2048 kB hugepages reported on node 1 00:08:56.103 [2024-07-23 05:08:26.993764] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.103 [2024-07-23 05:08:27.075561] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:56.103 [2024-07-23 05:08:27.075745] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:56.362 INFO: Running with entropic power schedule (0xFF, 100). 00:08:56.362 INFO: Seed: 2046401850 00:08:56.362 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:56.362 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:56.362 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:56.362 INFO: A corpus is not provided, starting from an empty corpus 00:08:56.362 #2 INITED exec/s: 0 rss: 61Mb 00:08:56.362 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:56.362 This may also happen if the target rejected all inputs we tried so far 00:08:56.362 [2024-07-23 05:08:27.364511] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:08:56.362 [2024-07-23 05:08:27.364554] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:56.362 [2024-07-23 05:08:27.364570] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:56.362 [2024-07-23 05:08:27.364594] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:56.362 [2024-07-23 05:08:27.365503] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:56.362 [2024-07-23 05:08:27.365528] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:56.362 [2024-07-23 05:08:27.365551] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:56.880 NEW_FUNC[1/638]: 0x4825d0 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:56.880 NEW_FUNC[2/638]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:56.880 #4 NEW cov: 10730 ft: 10702 corp: 2/67b lim: 320 exec/s: 0 rss: 66Mb L: 66/66 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:56.880 [2024-07-23 05:08:27.930747] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:56.880 [2024-07-23 05:08:27.930792] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:56.880 [2024-07-23 05:08:27.930808] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:56.880 [2024-07-23 05:08:27.930835] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:56.880 [2024-07-23 05:08:27.931761] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:56.880 [2024-07-23 05:08:27.931786] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:56.880 [2024-07-23 05:08:27.931812] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:57.138 #10 NEW cov: 10747 ft: 13309 corp: 3/231b lim: 320 exec/s: 0 rss: 68Mb L: 164/164 MS: 1 InsertRepeatedBytes- 00:08:57.138 [2024-07-23 05:08:28.089055] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:57.138 [2024-07-23 05:08:28.089087] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:57.138 [2024-07-23 05:08:28.089103] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:57.138 [2024-07-23 05:08:28.089128] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:57.138 [2024-07-23 05:08:28.090062] vfio_user.c:3094:vfio_user_log: *WARNING*: 
/tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:57.138 [2024-07-23 05:08:28.090088] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:57.138 [2024-07-23 05:08:28.090112] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:57.138 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:57.138 #11 NEW cov: 10764 ft: 14386 corp: 4/398b lim: 320 exec/s: 0 rss: 69Mb L: 167/167 MS: 1 CrossOver- 00:08:57.397 [2024-07-23 05:08:28.245718] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:57.397 [2024-07-23 05:08:28.245751] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:57.397 [2024-07-23 05:08:28.245766] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:57.397 [2024-07-23 05:08:28.245791] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:57.397 [2024-07-23 05:08:28.246716] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:57.397 [2024-07-23 05:08:28.246742] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:57.397 [2024-07-23 05:08:28.246766] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:57.397 #12 NEW cov: 10764 ft: 15388 corp: 5/464b lim: 320 exec/s: 12 rss: 69Mb L: 66/167 MS: 1 ShuffleBytes- 00:08:57.397 [2024-07-23 05:08:28.402323] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:57.397 [2024-07-23 05:08:28.402354] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:57.397 [2024-07-23 05:08:28.402370] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:57.397 [2024-07-23 05:08:28.402395] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:57.397 [2024-07-23 05:08:28.403320] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:57.397 [2024-07-23 05:08:28.403346] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:57.397 [2024-07-23 05:08:28.403371] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:57.656 #13 NEW cov: 10764 ft: 15522 corp: 6/628b lim: 320 exec/s: 13 rss: 69Mb L: 164/167 MS: 1 ShuffleBytes- 00:08:57.656 [2024-07-23 05:08:28.548817] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:57.656 [2024-07-23 05:08:28.548849] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:57.656 [2024-07-23 05:08:28.548865] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:57.656 [2024-07-23 
05:08:28.548889] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:57.656 [2024-07-23 05:08:28.549866] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:57.656 [2024-07-23 05:08:28.549892] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:57.656 [2024-07-23 05:08:28.549917] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:57.656 #14 NEW cov: 10764 ft: 15865 corp: 7/723b lim: 320 exec/s: 14 rss: 69Mb L: 95/167 MS: 1 EraseBytes- 00:08:57.656 [2024-07-23 05:08:28.694642] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:57.656 [2024-07-23 05:08:28.694675] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:57.657 [2024-07-23 05:08:28.694690] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:57.657 [2024-07-23 05:08:28.694715] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:57.657 [2024-07-23 05:08:28.695659] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:57.657 [2024-07-23 05:08:28.695685] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:57.657 [2024-07-23 05:08:28.695710] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:57.915 #15 NEW cov: 10764 ft: 16015 corp: 8/887b lim: 320 exec/s: 15 rss: 69Mb L: 164/167 MS: 1 CopyPart- 00:08:57.915 [2024-07-23 05:08:28.841069] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:57.915 [2024-07-23 05:08:28.841101] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:57.915 [2024-07-23 05:08:28.841116] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:57.915 [2024-07-23 05:08:28.841140] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:57.915 [2024-07-23 05:08:28.842082] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:57.915 [2024-07-23 05:08:28.842107] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:57.915 [2024-07-23 05:08:28.842132] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:57.915 #16 NEW cov: 10764 ft: 16488 corp: 9/1054b lim: 320 exec/s: 16 rss: 69Mb L: 167/167 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:57.915 [2024-07-23 05:08:28.996390] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:57.915 [2024-07-23 05:08:28.996423] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:57.915 [2024-07-23 05:08:28.996449] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 
failed: Invalid argument 00:08:57.915 [2024-07-23 05:08:28.996475] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:57.915 [2024-07-23 05:08:28.997392] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:57.915 [2024-07-23 05:08:28.997416] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:57.915 [2024-07-23 05:08:28.997446] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:58.174 #17 NEW cov: 10771 ft: 16760 corp: 10/1219b lim: 320 exec/s: 17 rss: 69Mb L: 165/167 MS: 1 InsertByte- 00:08:58.174 [2024-07-23 05:08:29.143019] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:58.174 [2024-07-23 05:08:29.143051] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:58.174 [2024-07-23 05:08:29.143069] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:58.174 [2024-07-23 05:08:29.143096] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:58.174 [2024-07-23 05:08:29.144062] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:58.174 [2024-07-23 05:08:29.144087] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:58.174 [2024-07-23 05:08:29.144112] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:58.174 #18 NEW cov: 10771 ft: 16863 corp: 11/1354b lim: 320 exec/s: 18 rss: 69Mb L: 135/167 MS: 1 EraseBytes- 00:08:58.433 [2024-07-23 05:08:29.298419] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:58.433 [2024-07-23 05:08:29.298464] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:58.433 [2024-07-23 05:08:29.298484] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:58.433 [2024-07-23 05:08:29.298509] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:58.433 [2024-07-23 05:08:29.299423] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:58.433 [2024-07-23 05:08:29.299454] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:58.433 [2024-07-23 05:08:29.299479] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:58.433 #19 NEW cov: 10771 ft: 16906 corp: 12/1489b lim: 320 exec/s: 9 rss: 69Mb L: 135/167 MS: 1 ChangeBit- 00:08:58.433 #19 DONE cov: 10771 ft: 16906 corp: 12/1489b lim: 320 exec/s: 9 rss: 69Mb 00:08:58.433 ###### Recommended dictionary. ###### 00:08:58.433 "\001\000\000\000" # Uses: 0 00:08:58.433 ###### End of recommended dictionary. 
###### 00:08:58.433 Done 19 runs in 2 second(s) 00:08:58.692 05:08:29 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:58.692 05:08:29 -- ../common.sh@72 -- # (( i++ )) 00:08:58.692 05:08:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.692 05:08:29 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:58.692 05:08:29 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:58.692 05:08:29 -- vfio/run.sh@23 -- # local timen=1 00:08:58.692 05:08:29 -- vfio/run.sh@24 -- # local core=0x1 00:08:58.692 05:08:29 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:58.692 05:08:29 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:58.692 05:08:29 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:58.692 05:08:29 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:58.692 05:08:29 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:58.692 05:08:29 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:58.692 05:08:29 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:58.692 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:58.692 05:08:29 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:58.692 [2024-07-23 05:08:29.732130] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:58.692 [2024-07-23 05:08:29.732199] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3152995 ] 00:08:58.692 EAL: No free 2048 kB hugepages reported on node 1 00:08:58.951 [2024-07-23 05:08:29.836017] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.951 [2024-07-23 05:08:29.918593] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:58.951 [2024-07-23 05:08:29.918776] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:59.210 INFO: Running with entropic power schedule (0xFF, 100). 00:08:59.210 INFO: Seed: 595434876 00:08:59.210 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:59.210 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:59.210 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:59.210 INFO: A corpus is not provided, starting from an empty corpus 00:08:59.210 #2 INITED exec/s: 0 rss: 61Mb 00:08:59.210 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:59.210 This may also happen if the target rejected all inputs we tried so far 00:08:59.210 [2024-07-23 05:08:30.246750] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.210 [2024-07-23 05:08:30.246807] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.727 NEW_FUNC[1/638]: 0x482fd0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:59.727 NEW_FUNC[2/638]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:59.727 #6 NEW cov: 10733 ft: 10684 corp: 2/121b lim: 120 exec/s: 0 rss: 66Mb L: 120/120 MS: 4 ShuffleBytes-CopyPart-InsertByte-InsertRepeatedBytes- 00:08:59.985 [2024-07-23 05:08:30.874754] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.985 [2024-07-23 05:08:30.874809] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.985 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:59.985 #7 NEW cov: 10764 ft: 12991 corp: 3/241b lim: 120 exec/s: 0 rss: 68Mb L: 120/120 MS: 1 ChangeBinInt- 00:09:00.244 [2024-07-23 05:08:31.112799] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.244 [2024-07-23 05:08:31.112839] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.244 #16 NEW cov: 10764 ft: 13649 corp: 4/355b lim: 120 exec/s: 16 rss: 69Mb L: 114/120 MS: 4 CopyPart-ChangeByte-ChangeBit-CrossOver- 00:09:00.503 [2024-07-23 05:08:31.365037] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.503 [2024-07-23 05:08:31.365076] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.503 #17 NEW cov: 10764 ft: 13951 corp: 5/470b lim: 120 exec/s: 17 rss: 69Mb L: 115/120 MS: 1 CrossOver- 00:09:00.763 [2024-07-23 05:08:31.604216] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.763 [2024-07-23 05:08:31.604256] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.763 #18 NEW cov: 10764 ft: 14386 corp: 6/584b lim: 120 exec/s: 18 rss: 69Mb L: 114/120 MS: 1 CopyPart- 00:09:00.763 [2024-07-23 05:08:31.841222] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.763 [2024-07-23 05:08:31.841261] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:01.022 #22 NEW cov: 10771 ft: 14438 corp: 7/624b lim: 120 exec/s: 22 rss: 69Mb L: 40/120 MS: 4 CrossOver-ChangeBinInt-InsertByte-InsertRepeatedBytes- 00:09:01.022 [2024-07-23 05:08:32.082780] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:01.022 [2024-07-23 05:08:32.082818] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:01.281 #23 NEW cov: 10771 ft: 14538 corp: 8/744b lim: 120 exec/s: 11 rss: 69Mb L: 120/120 MS: 1 CrossOver- 00:09:01.281 #23 DONE cov: 10771 ft: 14538 corp: 8/744b lim: 120 exec/s: 11 rss: 69Mb 00:09:01.281 Done 23 runs in 2 second(s) 00:09:01.540 05:08:32 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:09:01.540 05:08:32 -- ../common.sh@72 -- # (( i++ )) 00:09:01.540 05:08:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 
00:09:01.540 05:08:32 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:01.540 05:08:32 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:01.540 05:08:32 -- vfio/run.sh@23 -- # local timen=1 00:09:01.540 05:08:32 -- vfio/run.sh@24 -- # local core=0x1 00:09:01.540 05:08:32 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:01.540 05:08:32 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:01.540 05:08:32 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:01.540 05:08:32 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:01.540 05:08:32 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:01.540 05:08:32 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:01.540 05:08:32 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:01.540 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:01.540 05:08:32 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:01.540 [2024-07-23 05:08:32.558875] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:01.540 [2024-07-23 05:08:32.558953] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3153427 ] 00:09:01.540 EAL: No free 2048 kB hugepages reported on node 1 00:09:01.800 [2024-07-23 05:08:32.663595] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.800 [2024-07-23 05:08:32.745975] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:01.800 [2024-07-23 05:08:32.746161] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:02.059 INFO: Running with entropic power schedule (0xFF, 100). 00:09:02.059 INFO: Seed: 3427437698 00:09:02.059 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:09:02.059 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:09:02.059 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:02.059 INFO: A corpus is not provided, starting from an empty corpus 00:09:02.059 #2 INITED exec/s: 0 rss: 61Mb 00:09:02.059 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:02.059 This may also happen if the target rejected all inputs we tried so far 00:09:02.059 [2024-07-23 05:08:33.051503] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:02.059 [2024-07-23 05:08:33.051556] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:02.576 NEW_FUNC[1/638]: 0x483cc0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:02.576 NEW_FUNC[2/638]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:02.576 #3 NEW cov: 10720 ft: 10688 corp: 2/50b lim: 90 exec/s: 0 rss: 66Mb L: 49/49 MS: 1 InsertRepeatedBytes- 00:09:02.576 [2024-07-23 05:08:33.658792] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:02.576 [2024-07-23 05:08:33.658844] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:02.834 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:02.835 #4 NEW cov: 10756 ft: 13591 corp: 3/99b lim: 90 exec/s: 0 rss: 68Mb L: 49/49 MS: 1 CopyPart- 00:09:02.835 [2024-07-23 05:08:33.878058] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:02.835 [2024-07-23 05:08:33.878098] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:03.093 #5 NEW cov: 10756 ft: 13895 corp: 4/181b lim: 90 exec/s: 5 rss: 69Mb L: 82/82 MS: 1 InsertRepeatedBytes- 00:09:03.093 [2024-07-23 05:08:34.095385] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:03.093 [2024-07-23 05:08:34.095424] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:03.352 #11 NEW cov: 10756 ft: 14114 corp: 5/213b lim: 90 exec/s: 11 rss: 69Mb L: 32/82 MS: 1 EraseBytes- 00:09:03.353 [2024-07-23 05:08:34.303574] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:03.353 [2024-07-23 05:08:34.303613] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:03.353 #12 NEW cov: 10756 ft: 14771 corp: 6/272b lim: 90 exec/s: 12 rss: 69Mb L: 59/82 MS: 1 CopyPart- 00:09:03.612 [2024-07-23 05:08:34.522115] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:03.612 [2024-07-23 05:08:34.522154] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:03.612 #13 NEW cov: 10756 ft: 14826 corp: 7/306b lim: 90 exec/s: 13 rss: 69Mb L: 34/82 MS: 1 CMP- DE: "\001\010"- 00:09:03.880 [2024-07-23 05:08:34.727318] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:03.880 [2024-07-23 05:08:34.727358] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:03.880 #14 NEW cov: 10763 ft: 15431 corp: 8/338b lim: 90 exec/s: 14 rss: 69Mb L: 32/82 MS: 1 ChangeBit- 00:09:03.880 [2024-07-23 05:08:34.945346] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:03.880 [2024-07-23 05:08:34.945384] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:04.158 #15 NEW cov: 10763 ft: 16041 corp: 9/404b lim: 90 exec/s: 7 rss: 69Mb L: 66/82 MS: 1 EraseBytes- 00:09:04.158 #15 DONE cov: 10763 ft: 16041 corp: 9/404b lim: 90 
exec/s: 7 rss: 69Mb 00:09:04.158 ###### Recommended dictionary. ###### 00:09:04.158 "\001\010" # Uses: 0 00:09:04.158 ###### End of recommended dictionary. ###### 00:09:04.158 Done 15 runs in 2 second(s) 00:09:04.417 05:08:35 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:09:04.417 05:08:35 -- ../common.sh@72 -- # (( i++ )) 00:09:04.417 05:08:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:04.417 05:08:35 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:09:04.417 00:09:04.417 real 0m20.066s 00:09:04.417 user 0m25.995s 00:09:04.417 sys 0m2.102s 00:09:04.417 05:08:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.417 05:08:35 -- common/autotest_common.sh@10 -- # set +x 00:09:04.417 ************************************ 00:09:04.417 END TEST vfio_fuzz 00:09:04.417 ************************************ 00:09:04.417 05:08:35 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:09:04.417 00:09:04.417 real 1m26.003s 00:09:04.417 user 2m2.506s 00:09:04.417 sys 0m10.159s 00:09:04.417 05:08:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.417 05:08:35 -- common/autotest_common.sh@10 -- # set +x 00:09:04.417 ************************************ 00:09:04.417 END TEST llvm_fuzz 00:09:04.417 ************************************ 00:09:04.417 05:08:35 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]] 00:09:04.417 05:08:35 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT 00:09:04.417 05:08:35 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup 00:09:04.417 05:08:35 -- common/autotest_common.sh@712 -- # xtrace_disable 00:09:04.417 05:08:35 -- common/autotest_common.sh@10 -- # set +x 00:09:04.417 05:08:35 -- spdk/autotest.sh@386 -- # autotest_cleanup 00:09:04.417 05:08:35 -- common/autotest_common.sh@1371 -- # local autotest_es=0 00:09:04.417 05:08:35 -- common/autotest_common.sh@1372 -- # xtrace_disable 00:09:04.417 05:08:35 -- common/autotest_common.sh@10 -- # set +x 00:09:10.993 INFO: APP EXITING 00:09:10.993 INFO: killing all VMs 00:09:10.993 INFO: killing vhost app 00:09:10.993 WARN: no vhost pid file found 00:09:10.993 INFO: EXIT DONE 00:09:13.551 Waiting for block devices as requested 00:09:13.551 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:13.811 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:13.811 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:13.811 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:14.070 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:14.070 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:14.070 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:14.330 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:14.330 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:14.330 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:14.590 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:14.590 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:14.590 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:14.849 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:14.849 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:14.849 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:15.109 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:18.403 Cleaning 00:09:18.403 Removing: /dev/shm/spdk_tgt_trace.pid3114539 00:09:18.403 Removing: /var/run/dpdk/spdk_pid3112066 00:09:18.403 Removing: /var/run/dpdk/spdk_pid3113337 00:09:18.403 Removing: /var/run/dpdk/spdk_pid3114539 00:09:18.403 Removing: /var/run/dpdk/spdk_pid3115313 00:09:18.403 Removing: /var/run/dpdk/spdk_pid3115629 00:09:18.403 Removing: /var/run/dpdk/spdk_pid3115957 00:09:18.403 
Removing: /var/run/dpdk/spdk_pid3116349 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3116744 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3116927 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3117199 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3117515 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3118365 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3121654 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3122147 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3122447 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3122577 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3123256 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3123309 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3123883 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3124148 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3124457 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3124661 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3124776 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3125075 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3125653 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3125981 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3126493 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3126858 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3127163 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3127182 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3127353 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3127543 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3127807 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3128073 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3128364 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3128633 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3128916 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3129188 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3129471 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3129742 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3130031 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3130219 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3130450 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3130639 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3130898 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3131170 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3131454 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3131721 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3132013 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3132283 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3132570 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3132826 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3133050 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3133256 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3133459 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3133702 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3133989 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3134262 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3134549 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3134821 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3135102 00:09:18.663 Removing: /var/run/dpdk/spdk_pid3135381 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3135662 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3135915 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3136154 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3136369 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3136607 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3136815 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3137100 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3137370 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3137655 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3137857 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3138051 00:09:18.923 
Removing: /var/run/dpdk/spdk_pid3138767 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3139195 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3139614 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3140151 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3140628 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3140993 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3141532 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3142063 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3142382 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3142910 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3143447 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3143848 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3144284 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3144827 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3145285 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3145656 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3146198 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3146741 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3147068 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3147575 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3148119 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3148466 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3148950 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3149494 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3149916 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3150436 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3150948 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3151486 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3152036 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3152573 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3152995 00:09:18.923 Removing: /var/run/dpdk/spdk_pid3153427 00:09:18.923 Clean 00:09:19.182 killing process with pid 3066818 00:09:22.475 killing process with pid 3066815 00:09:22.734 killing process with pid 3066817 00:09:22.734 killing process with pid 3066816 00:09:22.734 05:08:53 -- common/autotest_common.sh@1436 -- # return 0 00:09:22.734 05:08:53 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup 00:09:22.734 05:08:53 -- common/autotest_common.sh@718 -- # xtrace_disable 00:09:22.734 05:08:53 -- common/autotest_common.sh@10 -- # set +x 00:09:22.734 05:08:53 -- spdk/autotest.sh@389 -- # timing_exit autotest 00:09:22.734 05:08:53 -- common/autotest_common.sh@718 -- # xtrace_disable 00:09:22.734 05:08:53 -- common/autotest_common.sh@10 -- # set +x 00:09:22.734 05:08:53 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:22.734 05:08:53 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:22.734 05:08:53 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:22.734 05:08:53 -- spdk/autotest.sh@394 -- # hash lcov 00:09:22.734 05:08:53 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:09:22.994 05:08:53 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:22.994 05:08:53 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:22.994 05:08:53 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:22.994 05:08:53 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:22.994 05:08:53 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:22.994 05:08:53 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:22.994 05:08:53 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:22.994 05:08:53 -- paths/export.sh@5 -- $ export PATH 00:09:22.994 05:08:53 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:22.994 05:08:53 -- common/autobuild_common.sh@437 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:22.994 05:08:53 -- common/autobuild_common.sh@438 -- $ date +%s 00:09:22.994 05:08:53 -- common/autobuild_common.sh@438 -- $ mktemp -dt spdk_1721704133.XXXXXX 00:09:22.994 05:08:53 -- common/autobuild_common.sh@438 -- $ SPDK_WORKSPACE=/tmp/spdk_1721704133.aqb0gh 00:09:22.994 05:08:53 -- common/autobuild_common.sh@440 -- $ [[ -n '' ]] 00:09:22.994 05:08:53 -- common/autobuild_common.sh@444 -- $ '[' -n '' ']' 00:09:22.994 05:08:53 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:09:22.994 05:08:53 -- common/autobuild_common.sh@451 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:22.994 05:08:53 -- common/autobuild_common.sh@453 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:22.994 05:08:53 -- common/autobuild_common.sh@454 -- $ get_config_params 00:09:22.994 05:08:53 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:09:22.994 05:08:53 -- common/autotest_common.sh@10 -- $ set +x 00:09:22.994 05:08:53 -- common/autobuild_common.sh@454 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:09:22.994 05:08:53 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:09:22.994 05:08:53 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 
00:09:22.994 05:08:53 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:22.994 05:08:53 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:09:22.994 05:08:53 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:22.994 05:08:53 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:22.994 05:08:53 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:22.994 05:08:53 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:22.994 05:08:53 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:22.994 05:08:53 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:22.994 + [[ -n 3023437 ]] 00:09:22.994 + sudo kill 3023437 00:09:23.004 [Pipeline] } 00:09:23.023 [Pipeline] // stage 00:09:23.029 [Pipeline] } 00:09:23.047 [Pipeline] // timeout 00:09:23.053 [Pipeline] } 00:09:23.070 [Pipeline] // catchError 00:09:23.075 [Pipeline] } 00:09:23.093 [Pipeline] // wrap 00:09:23.099 [Pipeline] } 00:09:23.115 [Pipeline] // catchError 00:09:23.124 [Pipeline] stage 00:09:23.127 [Pipeline] { (Epilogue) 00:09:23.141 [Pipeline] catchError 00:09:23.143 [Pipeline] { 00:09:23.157 [Pipeline] echo 00:09:23.158 Cleanup processes 00:09:23.164 [Pipeline] sh 00:09:23.450 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:23.450 3162523 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:23.488 [Pipeline] sh 00:09:23.797 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:23.797 ++ grep -v 'sudo pgrep' 00:09:23.797 ++ awk '{print $1}' 00:09:23.797 + sudo kill -9 00:09:23.797 + true 00:09:23.808 [Pipeline] sh 00:09:24.092 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:24.092 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:09:24.092 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:09:26.011 [Pipeline] sh 00:09:26.295 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:09:26.295 Artifacts sizes are good 00:09:26.310 [Pipeline] archiveArtifacts 00:09:26.317 Archiving artifacts 00:09:26.372 [Pipeline] sh 00:09:26.657 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest 00:09:26.672 [Pipeline] cleanWs 00:09:26.681 [WS-CLEANUP] Deleting project workspace... 00:09:26.681 [WS-CLEANUP] Deferred wipeout is used... 00:09:26.688 [WS-CLEANUP] done 00:09:26.691 [Pipeline] } 00:09:26.711 [Pipeline] // catchError 00:09:26.722 [Pipeline] sh 00:09:27.020 + logger -p user.info -t JENKINS-CI 00:09:27.030 [Pipeline] } 00:09:27.045 [Pipeline] // stage 00:09:27.051 [Pipeline] } 00:09:27.067 [Pipeline] // node 00:09:27.073 [Pipeline] End of Pipeline 00:09:27.100 Finished: SUCCESS