00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 1010 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3677 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.021 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.022 The recommended git tool is: git 00:00:00.023 using credential 00000000-0000-0000-0000-000000000002 00:00:00.025 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.041 Fetching changes from the remote Git repository 00:00:00.076 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.094 Using shallow fetch with depth 1 00:00:00.094 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.094 > git --version # timeout=10 00:00:00.115 > git --version # 'git version 2.39.2' 00:00:00.115 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.136 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.136 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:03.186 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:03.196 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.207 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:03.207 > git config core.sparsecheckout # timeout=10 00:00:03.217 > git read-tree -mu HEAD # timeout=10 00:00:03.232 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:03.257 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:03.257 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:03.357 [Pipeline] Start of Pipeline 00:00:03.369 [Pipeline] library 00:00:03.371 Loading library shm_lib@master 00:00:03.371 Library shm_lib@master is cached. Copying from home. 00:00:03.387 [Pipeline] node 00:00:03.403 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.405 [Pipeline] { 00:00:03.416 [Pipeline] catchError 00:00:03.417 [Pipeline] { 00:00:03.430 [Pipeline] wrap 00:00:03.439 [Pipeline] { 00:00:03.447 [Pipeline] stage 00:00:03.448 [Pipeline] { (Prologue) 00:00:03.688 [Pipeline] sh 00:00:03.978 + logger -p user.info -t JENKINS-CI 00:00:04.003 [Pipeline] echo 00:00:04.004 Node: WFP20 00:00:04.010 [Pipeline] sh 00:00:04.302 [Pipeline] setCustomBuildProperty 00:00:04.314 [Pipeline] echo 00:00:04.316 Cleanup processes 00:00:04.321 [Pipeline] sh 00:00:04.603 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.603 2083243 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.614 [Pipeline] sh 00:00:04.894 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.894 ++ grep -v 'sudo pgrep' 00:00:04.894 ++ awk '{print $1}' 00:00:04.894 + sudo kill -9 00:00:04.894 + true 00:00:04.909 [Pipeline] cleanWs 00:00:04.919 [WS-CLEANUP] Deleting project workspace... 00:00:04.919 [WS-CLEANUP] Deferred wipeout is used... 
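Editor's note: the prologue above kills any stale processes still referencing the workspace before wiping it. A minimal sketch of that idiom follows, assuming the same workspace path; the trailing "|| true" mirrors the "+ true" in the trace, so an empty match list (as happened here, where pgrep only matched itself) does not fail the build.

#!/usr/bin/env bash
# Minimal sketch of the stale-process cleanup traced above.
# WORKSPACE is the job's Jenkins workspace path.
WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

# List every process whose command line mentions the workspace,
# drop the pgrep invocation itself, and keep only the PIDs.
pids=$(sudo pgrep -af "$WORKSPACE" | grep -v 'sudo pgrep' | awk '{print $1}')

# kill -9 with an empty PID list exits non-zero; tolerate that case,
# as the trace does with "+ true".
sudo kill -9 $pids || true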
00:00:04.924 [WS-CLEANUP] done 00:00:04.930 [Pipeline] setCustomBuildProperty 00:00:04.946 [Pipeline] sh 00:00:05.222 + sudo git config --global --replace-all safe.directory '*' 00:00:05.301 [Pipeline] httpRequest 00:00:06.386 [Pipeline] echo 00:00:06.387 Sorcerer 10.211.164.20 is alive 00:00:06.394 [Pipeline] retry 00:00:06.395 [Pipeline] { 00:00:06.404 [Pipeline] httpRequest 00:00:06.407 HttpMethod: GET 00:00:06.408 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:06.408 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:06.425 Response Code: HTTP/1.1 200 OK 00:00:06.426 Success: Status code 200 is in the accepted range: 200,404 00:00:06.426 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:10.698 [Pipeline] } 00:00:10.716 [Pipeline] // retry 00:00:10.724 [Pipeline] sh 00:00:11.010 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:11.027 [Pipeline] httpRequest 00:00:11.426 [Pipeline] echo 00:00:11.428 Sorcerer 10.211.164.20 is alive 00:00:11.438 [Pipeline] retry 00:00:11.440 [Pipeline] { 00:00:11.455 [Pipeline] httpRequest 00:00:11.460 HttpMethod: GET 00:00:11.460 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:11.461 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:11.474 Response Code: HTTP/1.1 200 OK 00:00:11.475 Success: Status code 200 is in the accepted range: 200,404 00:00:11.475 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:01:00.073 [Pipeline] } 00:01:00.090 [Pipeline] // retry 00:01:00.101 [Pipeline] sh 00:01:00.416 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:01:02.968 [Pipeline] sh 00:01:03.254 + git -C spdk log --oneline -n5 00:01:03.254 c13c99a5e test: Various fixes for Fedora40 00:01:03.254 726a04d70 test/nvmf: adjust timeout for bigger nvmes 00:01:03.254 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11 00:01:03.254 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched 00:01:03.254 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges 00:01:03.273 [Pipeline] withCredentials 00:01:03.286 > git --version # timeout=10 00:01:03.300 > git --version # 'git version 2.39.2' 00:01:03.318 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:03.320 [Pipeline] { 00:01:03.329 [Pipeline] retry 00:01:03.331 [Pipeline] { 00:01:03.347 [Pipeline] sh 00:01:03.632 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:01:03.907 [Pipeline] } 00:01:03.925 [Pipeline] // retry 00:01:03.930 [Pipeline] } 00:01:03.947 [Pipeline] // withCredentials 00:01:03.958 [Pipeline] httpRequest 00:01:04.337 [Pipeline] echo 00:01:04.339 Sorcerer 10.211.164.20 is alive 00:01:04.348 [Pipeline] retry 00:01:04.350 [Pipeline] { 00:01:04.362 [Pipeline] httpRequest 00:01:04.367 HttpMethod: GET 00:01:04.367 URL: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:04.368 Sending request to url: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:04.378 Response Code: HTTP/1.1 200 OK 00:01:04.379 Success: Status code 200 is in the accepted range: 200,404 00:01:04.379 Saving response body to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:19.770 [Pipeline] } 00:01:19.789 [Pipeline] // retry 00:01:19.797 [Pipeline] sh 00:01:20.085 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:21.531 [Pipeline] sh 00:01:21.814 + git -C dpdk log --oneline -n5 00:01:21.814 caf0f5d395 version: 22.11.4 00:01:21.814 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:21.814 dc9c799c7d vhost: fix missing spinlock unlock 00:01:21.814 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:21.814 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:21.823 [Pipeline] } 00:01:21.837 [Pipeline] // stage 00:01:21.847 [Pipeline] stage 00:01:21.849 [Pipeline] { (Prepare) 00:01:21.869 [Pipeline] writeFile 00:01:21.883 [Pipeline] sh 00:01:22.162 + logger -p user.info -t JENKINS-CI 00:01:22.173 [Pipeline] sh 00:01:22.455 + logger -p user.info -t JENKINS-CI 00:01:22.466 [Pipeline] sh 00:01:22.748 + cat autorun-spdk.conf 00:01:22.748 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:22.748 SPDK_RUN_UBSAN=1 00:01:22.748 SPDK_TEST_FUZZER=1 00:01:22.748 SPDK_TEST_FUZZER_SHORT=1 00:01:22.748 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:22.748 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:22.755 RUN_NIGHTLY=1 00:01:22.759 [Pipeline] readFile 00:01:22.784 [Pipeline] withEnv 00:01:22.787 [Pipeline] { 00:01:22.799 [Pipeline] sh 00:01:23.082 + set -ex 00:01:23.082 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:23.082 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:23.082 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:23.082 ++ SPDK_RUN_UBSAN=1 00:01:23.082 ++ SPDK_TEST_FUZZER=1 00:01:23.082 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:23.082 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:23.082 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:23.082 ++ RUN_NIGHTLY=1 00:01:23.082 + case $SPDK_TEST_NVMF_NICS in 00:01:23.082 + DRIVERS= 00:01:23.082 + [[ -n '' ]] 00:01:23.082 + exit 0 00:01:23.090 [Pipeline] } 00:01:23.105 [Pipeline] // withEnv 00:01:23.110 [Pipeline] } 00:01:23.124 [Pipeline] // stage 00:01:23.134 [Pipeline] catchError 00:01:23.136 [Pipeline] { 00:01:23.150 [Pipeline] timeout 00:01:23.150 Timeout set to expire in 30 min 00:01:23.152 [Pipeline] { 00:01:23.166 [Pipeline] stage 00:01:23.168 [Pipeline] { (Tests) 00:01:23.182 [Pipeline] sh 00:01:23.464 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:23.464 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:23.464 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:23.464 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:23.464 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:23.464 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:23.464 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:23.464 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:23.464 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:23.464 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:23.464 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:01:23.464 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:23.464 + source /etc/os-release 00:01:23.464 ++ NAME='Fedora Linux' 00:01:23.464 ++ VERSION='39 (Cloud Edition)' 00:01:23.464 ++ ID=fedora 00:01:23.464 ++ VERSION_ID=39 00:01:23.464 ++ VERSION_CODENAME= 00:01:23.464 ++ PLATFORM_ID=platform:f39 00:01:23.465 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:01:23.465 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:23.465 ++ LOGO=fedora-logo-icon 00:01:23.465 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:01:23.465 ++ HOME_URL=https://fedoraproject.org/ 00:01:23.465 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:01:23.465 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:23.465 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:23.465 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:23.465 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:01:23.465 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:23.465 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:01:23.465 ++ SUPPORT_END=2024-11-12 00:01:23.465 ++ VARIANT='Cloud Edition' 00:01:23.465 ++ VARIANT_ID=cloud 00:01:23.465 + uname -a 00:01:23.465 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:01:23.465 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:01:26.749 Hugepages 00:01:26.749 node hugesize free / total 00:01:26.749 node0 1048576kB 0 / 0 00:01:26.749 node0 2048kB 0 / 0 00:01:26.749 node1 1048576kB 0 / 0 00:01:26.749 node1 2048kB 0 / 0 00:01:26.749 00:01:26.749 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:26.749 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:26.749 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:26.749 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:26.749 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:26.749 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:26.749 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:26.749 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:26.749 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:26.749 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:26.749 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:26.749 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:26.749 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:26.749 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:26.749 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:26.749 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:26.749 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:26.749 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:01:26.749 + rm -f /tmp/spdk-ld-path 00:01:26.749 + source autorun-spdk.conf 00:01:26.749 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:26.749 ++ SPDK_RUN_UBSAN=1 00:01:26.749 ++ SPDK_TEST_FUZZER=1 00:01:26.749 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:26.749 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:26.749 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:26.749 ++ RUN_NIGHTLY=1 00:01:26.749 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:26.749 + [[ -n '' ]] 00:01:26.749 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:26.749 + for M in /var/spdk/build-*-manifest.txt 00:01:26.749 + [[ -f 
/var/spdk/build-kernel-manifest.txt ]] 00:01:26.749 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:26.749 + for M in /var/spdk/build-*-manifest.txt 00:01:26.749 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:26.749 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:26.749 + for M in /var/spdk/build-*-manifest.txt 00:01:26.749 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:26.749 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:26.749 ++ uname 00:01:26.749 + [[ Linux == \L\i\n\u\x ]] 00:01:26.749 + sudo dmesg -T 00:01:26.750 + sudo dmesg --clear 00:01:26.750 + dmesg_pid=2084168 00:01:26.750 + [[ Fedora Linux == FreeBSD ]] 00:01:26.750 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:26.750 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:26.750 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:26.750 + [[ -x /usr/src/fio-static/fio ]] 00:01:26.750 + export FIO_BIN=/usr/src/fio-static/fio 00:01:26.750 + FIO_BIN=/usr/src/fio-static/fio 00:01:26.750 + sudo dmesg -Tw 00:01:26.750 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:26.750 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:26.750 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:26.750 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:26.750 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:26.750 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:26.750 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:26.750 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:26.750 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:26.750 Test configuration: 00:01:26.750 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:26.750 SPDK_RUN_UBSAN=1 00:01:26.750 SPDK_TEST_FUZZER=1 00:01:26.750 SPDK_TEST_FUZZER_SHORT=1 00:01:26.750 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:26.750 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:26.750 RUN_NIGHTLY=1 05:26:37 -- common/autotest_common.sh@1689 -- $ [[ n == y ]] 00:01:26.750 05:26:37 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:26.750 05:26:37 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:26.750 05:26:37 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:26.750 05:26:37 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:26.750 05:26:37 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:26.750 05:26:37 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:26.750 05:26:37 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:26.750 05:26:37 -- paths/export.sh@5 -- $ export PATH 00:01:26.750 05:26:37 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:26.750 05:26:37 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:26.750 05:26:37 -- common/autobuild_common.sh@440 -- $ date +%s 00:01:26.750 05:26:37 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732854397.XXXXXX 00:01:26.750 05:26:37 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732854397.sAs4cI 00:01:26.750 05:26:37 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:01:26.750 05:26:37 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']' 00:01:26.750 05:26:37 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:26.750 05:26:37 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:26.750 05:26:37 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:26.750 05:26:37 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:26.750 05:26:37 -- common/autobuild_common.sh@456 -- $ get_config_params 00:01:26.750 05:26:37 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:01:26.750 05:26:37 -- common/autotest_common.sh@10 -- $ set +x 00:01:26.750 05:26:37 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:26.750 05:26:37 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:26.750 05:26:37 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:26.750 05:26:37 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:26.750 05:26:37 -- spdk/autobuild.sh@16 -- $ date -u 00:01:26.750 Fri Nov 29 04:26:37 AM UTC 2024 00:01:26.750 05:26:37 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:26.750 LTS-67-gc13c99a5e 00:01:26.750 05:26:37 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:26.750 05:26:37 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:26.750 05:26:37 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:26.750 05:26:37 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:26.750 05:26:37 -- common/autotest_common.sh@1093 -- $ 
xtrace_disable 00:01:26.750 05:26:37 -- common/autotest_common.sh@10 -- $ set +x 00:01:26.750 ************************************ 00:01:26.750 START TEST ubsan 00:01:26.750 ************************************ 00:01:26.750 05:26:37 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan' 00:01:26.750 using ubsan 00:01:26.750 00:01:26.750 real 0m0.001s 00:01:26.750 user 0m0.000s 00:01:26.750 sys 0m0.000s 00:01:26.750 05:26:37 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:26.750 05:26:37 -- common/autotest_common.sh@10 -- $ set +x 00:01:26.750 ************************************ 00:01:26.750 END TEST ubsan 00:01:26.750 ************************************ 00:01:26.750 05:26:38 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:26.750 05:26:38 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:26.750 05:26:38 -- common/autobuild_common.sh@432 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:26.750 05:26:38 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:01:26.750 05:26:38 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:26.750 05:26:38 -- common/autotest_common.sh@10 -- $ set +x 00:01:26.750 ************************************ 00:01:26.750 START TEST build_native_dpdk 00:01:26.750 ************************************ 00:01:26.750 05:26:38 -- common/autotest_common.sh@1114 -- $ _build_native_dpdk 00:01:26.750 05:26:38 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:26.750 05:26:38 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:26.750 05:26:38 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:26.750 05:26:38 -- common/autobuild_common.sh@51 -- $ local compiler 00:01:26.750 05:26:38 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:26.750 05:26:38 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:26.750 05:26:38 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:26.750 05:26:38 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:26.750 05:26:38 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:26.750 05:26:38 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:26.750 05:26:38 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:26.750 05:26:38 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:26.750 05:26:38 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:26.750 05:26:38 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:26.750 05:26:38 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:26.750 05:26:38 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:27.009 05:26:38 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:27.009 05:26:38 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:27.009 05:26:38 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:27.009 05:26:38 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:27.009 caf0f5d395 version: 22.11.4 00:01:27.009 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:27.009 dc9c799c7d vhost: fix missing spinlock unlock 00:01:27.009 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:27.009 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:27.009 05:26:38 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:27.009 05:26:38 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:27.009 05:26:38 -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:27.009 05:26:38 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:27.009 05:26:38 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:27.009 05:26:38 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:27.009 05:26:38 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:27.009 05:26:38 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:27.009 05:26:38 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:27.009 05:26:38 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:27.009 05:26:38 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:27.009 05:26:38 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:27.009 05:26:38 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:27.009 05:26:38 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:27.009 05:26:38 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:27.009 05:26:38 -- common/autobuild_common.sh@168 -- $ uname -s 00:01:27.009 05:26:38 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:27.009 05:26:38 -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:01:27.009 05:26:38 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:01:27.009 05:26:38 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:27.009 05:26:38 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:27.009 05:26:38 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:27.009 05:26:38 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:27.009 05:26:38 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:27.009 05:26:38 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:27.009 05:26:38 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:27.009 05:26:38 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:27.009 05:26:38 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:27.009 05:26:38 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:27.009 05:26:38 -- scripts/common.sh@343 -- $ case "$op" in 00:01:27.009 05:26:38 -- scripts/common.sh@344 -- $ : 1 00:01:27.009 05:26:38 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:27.009 05:26:38 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:27.009 05:26:38 -- scripts/common.sh@364 -- $ decimal 22 00:01:27.009 05:26:38 -- scripts/common.sh@352 -- $ local d=22 00:01:27.009 05:26:38 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:27.009 05:26:38 -- scripts/common.sh@354 -- $ echo 22 00:01:27.009 05:26:38 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:27.009 05:26:38 -- scripts/common.sh@365 -- $ decimal 21 00:01:27.009 05:26:38 -- scripts/common.sh@352 -- $ local d=21 00:01:27.009 05:26:38 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:27.009 05:26:38 -- scripts/common.sh@354 -- $ echo 21 00:01:27.009 05:26:38 -- scripts/common.sh@365 -- $ ver2[v]=21 00:01:27.009 05:26:38 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:27.009 05:26:38 -- scripts/common.sh@366 -- $ return 1 00:01:27.009 05:26:38 -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:27.009 patching file config/rte_config.h 00:01:27.009 Hunk #1 succeeded at 60 (offset 1 line). 00:01:27.009 05:26:38 -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:01:27.009 05:26:38 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:01:27.009 05:26:38 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:27.009 05:26:38 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:27.009 05:26:38 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:27.009 05:26:38 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:27.009 05:26:38 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:27.009 05:26:38 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:27.009 05:26:38 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:27.009 05:26:38 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:27.009 05:26:38 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:27.009 05:26:38 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:27.009 05:26:38 -- scripts/common.sh@343 -- $ case "$op" in 00:01:27.009 05:26:38 -- scripts/common.sh@344 -- $ : 1 00:01:27.009 05:26:38 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:27.009 05:26:38 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:27.009 05:26:38 -- scripts/common.sh@364 -- $ decimal 22 00:01:27.009 05:26:38 -- scripts/common.sh@352 -- $ local d=22 00:01:27.009 05:26:38 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:27.009 05:26:38 -- scripts/common.sh@354 -- $ echo 22 00:01:27.009 05:26:38 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:27.009 05:26:38 -- scripts/common.sh@365 -- $ decimal 24 00:01:27.009 05:26:38 -- scripts/common.sh@352 -- $ local d=24 00:01:27.009 05:26:38 -- scripts/common.sh@353 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:27.009 05:26:38 -- scripts/common.sh@354 -- $ echo 24 00:01:27.009 05:26:38 -- scripts/common.sh@365 -- $ ver2[v]=24 00:01:27.009 05:26:38 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:27.009 05:26:38 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:01:27.009 05:26:38 -- scripts/common.sh@367 -- $ return 0 00:01:27.009 05:26:38 -- common/autobuild_common.sh@177 -- $ patch -p1 00:01:27.009 patching file lib/pcapng/rte_pcapng.c 00:01:27.009 Hunk #1 succeeded at 110 (offset -18 lines). 
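Editor's note: the xtrace above walks through the cmp_versions helper in scripts/common.sh twice — first to confirm 22.11.4 is not older than 21.11.0, then that it is older than 24.07.0 — and applies the rte_config.h and rte_pcapng.c patches only in that window. A minimal standalone sketch of the same dotted-version comparison (ver_lt is a hypothetical wrapper name; the real helper takes an operator argument):

#!/usr/bin/env bash
# Sketch of the per-component version compare traced above.
ver_lt() {    # usage: ver_lt 22.11.4 24.07.0  -> exit 0 iff $1 < $2
    local -a ver1 ver2
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$2"
    local v a b
    for ((v = 0; v < ${#ver1[@]} || v < ${#ver2[@]}; v++)); do
        a=${ver1[v]:-0} b=${ver2[v]:-0}
        ((10#$a > 10#$b)) && return 1   # first differing field decides
        ((10#$a < 10#$b)) && return 0
    done
    return 1                            # equal versions are not "less than"
}

# Patch gating as in the trace: ver_lt 22.11.4 21.11.0 fails (no patch),
# ver_lt 22.11.4 24.07.0 succeeds (pcapng patch applies).
ver_lt 22.11.4 24.07.0 && echo "patch applies"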
00:01:27.009 05:26:38 -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false 00:01:27.009 05:26:38 -- common/autobuild_common.sh@181 -- $ uname -s 00:01:27.009 05:26:38 -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']' 00:01:27.009 05:26:38 -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:27.009 05:26:38 -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:32.275 The Meson build system 00:01:32.275 Version: 1.5.0 00:01:32.275 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:32.275 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:32.275 Build type: native build 00:01:32.275 Program cat found: YES (/usr/bin/cat) 00:01:32.275 Project name: DPDK 00:01:32.275 Project version: 22.11.4 00:01:32.275 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:01:32.275 C linker for the host machine: gcc ld.bfd 2.40-14 00:01:32.275 Host machine cpu family: x86_64 00:01:32.275 Host machine cpu: x86_64 00:01:32.275 Message: ## Building in Developer Mode ## 00:01:32.275 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:32.275 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:32.275 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:32.275 Program objdump found: YES (/usr/bin/objdump) 00:01:32.275 Program python3 found: YES (/usr/bin/python3) 00:01:32.275 Program cat found: YES (/usr/bin/cat) 00:01:32.275 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
00:01:32.275 Checking for size of "void *" : 8 00:01:32.275 Checking for size of "void *" : 8 (cached) 00:01:32.275 Library m found: YES 00:01:32.275 Library numa found: YES 00:01:32.275 Has header "numaif.h" : YES 00:01:32.275 Library fdt found: NO 00:01:32.275 Library execinfo found: NO 00:01:32.275 Has header "execinfo.h" : YES 00:01:32.275 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:32.275 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:32.275 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:32.275 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:32.275 Run-time dependency openssl found: YES 3.1.1 00:01:32.275 Run-time dependency libpcap found: YES 1.10.4 00:01:32.275 Has header "pcap.h" with dependency libpcap: YES 00:01:32.275 Compiler for C supports arguments -Wcast-qual: YES 00:01:32.275 Compiler for C supports arguments -Wdeprecated: YES 00:01:32.275 Compiler for C supports arguments -Wformat: YES 00:01:32.275 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:32.275 Compiler for C supports arguments -Wformat-security: NO 00:01:32.275 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:32.275 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:32.275 Compiler for C supports arguments -Wnested-externs: YES 00:01:32.275 Compiler for C supports arguments -Wold-style-definition: YES 00:01:32.275 Compiler for C supports arguments -Wpointer-arith: YES 00:01:32.275 Compiler for C supports arguments -Wsign-compare: YES 00:01:32.275 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:32.275 Compiler for C supports arguments -Wundef: YES 00:01:32.275 Compiler for C supports arguments -Wwrite-strings: YES 00:01:32.275 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:32.275 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:32.275 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:32.275 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:32.275 Compiler for C supports arguments -mavx512f: YES 00:01:32.275 Checking if "AVX512 checking" compiles: YES 00:01:32.275 Fetching value of define "__SSE4_2__" : 1 00:01:32.275 Fetching value of define "__AES__" : 1 00:01:32.275 Fetching value of define "__AVX__" : 1 00:01:32.275 Fetching value of define "__AVX2__" : 1 00:01:32.275 Fetching value of define "__AVX512BW__" : 1 00:01:32.275 Fetching value of define "__AVX512CD__" : 1 00:01:32.275 Fetching value of define "__AVX512DQ__" : 1 00:01:32.275 Fetching value of define "__AVX512F__" : 1 00:01:32.275 Fetching value of define "__AVX512VL__" : 1 00:01:32.275 Fetching value of define "__PCLMUL__" : 1 00:01:32.275 Fetching value of define "__RDRND__" : 1 00:01:32.275 Fetching value of define "__RDSEED__" : 1 00:01:32.275 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:32.275 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:32.275 Message: lib/kvargs: Defining dependency "kvargs" 00:01:32.276 Message: lib/telemetry: Defining dependency "telemetry" 00:01:32.276 Checking for function "getentropy" : YES 00:01:32.276 Message: lib/eal: Defining dependency "eal" 00:01:32.276 Message: lib/ring: Defining dependency "ring" 00:01:32.276 Message: lib/rcu: Defining dependency "rcu" 00:01:32.276 Message: lib/mempool: Defining dependency "mempool" 00:01:32.276 Message: lib/mbuf: Defining dependency "mbuf" 00:01:32.276 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:32.276 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:32.276 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:32.276 Compiler for C supports arguments -mpclmul: YES 00:01:32.276 Compiler for C supports arguments -maes: YES 00:01:32.276 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:32.276 Compiler for C supports arguments -mavx512bw: YES 00:01:32.276 Compiler for C supports arguments -mavx512dq: YES 00:01:32.276 Compiler for C supports arguments -mavx512vl: YES 00:01:32.276 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:32.276 Compiler for C supports arguments -mavx2: YES 00:01:32.276 Compiler for C supports arguments -mavx: YES 00:01:32.276 Message: lib/net: Defining dependency "net" 00:01:32.276 Message: lib/meter: Defining dependency "meter" 00:01:32.276 Message: lib/ethdev: Defining dependency "ethdev" 00:01:32.276 Message: lib/pci: Defining dependency "pci" 00:01:32.276 Message: lib/cmdline: Defining dependency "cmdline" 00:01:32.276 Message: lib/metrics: Defining dependency "metrics" 00:01:32.276 Message: lib/hash: Defining dependency "hash" 00:01:32.276 Message: lib/timer: Defining dependency "timer" 00:01:32.276 Fetching value of define "__AVX2__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:32.276 Message: lib/acl: Defining dependency "acl" 00:01:32.276 Message: lib/bbdev: Defining dependency "bbdev" 00:01:32.276 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:32.276 Run-time dependency libelf found: YES 0.191 00:01:32.276 Message: lib/bpf: Defining dependency "bpf" 00:01:32.276 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:32.276 Message: lib/compressdev: Defining dependency "compressdev" 00:01:32.276 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:32.276 Message: lib/distributor: Defining dependency "distributor" 00:01:32.276 Message: lib/efd: Defining dependency "efd" 00:01:32.276 Message: lib/eventdev: Defining dependency "eventdev" 00:01:32.276 Message: lib/gpudev: Defining dependency "gpudev" 00:01:32.276 Message: lib/gro: Defining dependency "gro" 00:01:32.276 Message: lib/gso: Defining dependency "gso" 00:01:32.276 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:32.276 Message: lib/jobstats: Defining dependency "jobstats" 00:01:32.276 Message: lib/latencystats: Defining dependency "latencystats" 00:01:32.276 Message: lib/lpm: Defining dependency "lpm" 00:01:32.276 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512IFMA__" : (undefined) 00:01:32.276 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:32.276 Message: lib/member: Defining dependency "member" 00:01:32.276 Message: lib/pcapng: Defining dependency "pcapng" 00:01:32.276 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:32.276 Message: lib/power: Defining dependency "power" 00:01:32.276 Message: lib/rawdev: Defining dependency "rawdev" 00:01:32.276 Message: lib/regexdev: Defining dependency "regexdev" 00:01:32.276 Message: lib/dmadev: 
Defining dependency "dmadev" 00:01:32.276 Message: lib/rib: Defining dependency "rib" 00:01:32.276 Message: lib/reorder: Defining dependency "reorder" 00:01:32.276 Message: lib/sched: Defining dependency "sched" 00:01:32.276 Message: lib/security: Defining dependency "security" 00:01:32.276 Message: lib/stack: Defining dependency "stack" 00:01:32.276 Has header "linux/userfaultfd.h" : YES 00:01:32.276 Message: lib/vhost: Defining dependency "vhost" 00:01:32.276 Message: lib/ipsec: Defining dependency "ipsec" 00:01:32.276 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:32.276 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:32.276 Message: lib/fib: Defining dependency "fib" 00:01:32.276 Message: lib/port: Defining dependency "port" 00:01:32.276 Message: lib/pdump: Defining dependency "pdump" 00:01:32.276 Message: lib/table: Defining dependency "table" 00:01:32.276 Message: lib/pipeline: Defining dependency "pipeline" 00:01:32.276 Message: lib/graph: Defining dependency "graph" 00:01:32.276 Message: lib/node: Defining dependency "node" 00:01:32.276 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:32.276 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:32.276 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:32.276 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:32.276 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:32.276 Compiler for C supports arguments -Wno-unused-value: YES 00:01:32.276 Compiler for C supports arguments -Wno-format: YES 00:01:32.276 Compiler for C supports arguments -Wno-format-security: YES 00:01:32.276 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:32.854 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:32.854 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:32.854 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:32.854 Fetching value of define "__AVX2__" : 1 (cached) 00:01:32.854 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:32.854 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:32.854 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:32.854 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:32.854 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:32.854 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:32.854 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:32.854 Configuring doxy-api.conf using configuration 00:01:32.854 Program sphinx-build found: NO 00:01:32.854 Configuring rte_build_config.h using configuration 00:01:32.854 Message: 00:01:32.854 ================= 00:01:32.854 Applications Enabled 00:01:32.854 ================= 00:01:32.854 00:01:32.854 apps: 00:01:32.854 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:01:32.854 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:01:32.854 test-security-perf, 00:01:32.854 00:01:32.854 Message: 00:01:32.854 ================= 00:01:32.854 Libraries Enabled 00:01:32.854 ================= 00:01:32.854 00:01:32.854 libs: 00:01:32.854 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:01:32.854 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:01:32.854 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:01:32.854 eventdev, 
gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm, 00:01:32.854 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:01:32.854 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:01:32.854 table, pipeline, graph, node, 00:01:32.854 00:01:32.854 Message: 00:01:32.854 =============== 00:01:32.854 Drivers Enabled 00:01:32.854 =============== 00:01:32.854 00:01:32.854 common: 00:01:32.854 00:01:32.854 bus: 00:01:32.854 pci, vdev, 00:01:32.854 mempool: 00:01:32.854 ring, 00:01:32.854 dma: 00:01:32.854 00:01:32.854 net: 00:01:32.854 i40e, 00:01:32.854 raw: 00:01:32.854 00:01:32.854 crypto: 00:01:32.854 00:01:32.854 compress: 00:01:32.854 00:01:32.854 regex: 00:01:32.854 00:01:32.854 vdpa: 00:01:32.854 00:01:32.854 event: 00:01:32.854 00:01:32.854 baseband: 00:01:32.854 00:01:32.854 gpu: 00:01:32.854 00:01:32.854 00:01:32.854 Message: 00:01:32.854 ================= 00:01:32.854 Content Skipped 00:01:32.854 ================= 00:01:32.854 00:01:32.854 apps: 00:01:32.854 00:01:32.854 libs: 00:01:32.854 kni: explicitly disabled via build config (deprecated lib) 00:01:32.854 flow_classify: explicitly disabled via build config (deprecated lib) 00:01:32.854 00:01:32.854 drivers: 00:01:32.854 common/cpt: not in enabled drivers build config 00:01:32.854 common/dpaax: not in enabled drivers build config 00:01:32.854 common/iavf: not in enabled drivers build config 00:01:32.854 common/idpf: not in enabled drivers build config 00:01:32.854 common/mvep: not in enabled drivers build config 00:01:32.854 common/octeontx: not in enabled drivers build config 00:01:32.854 bus/auxiliary: not in enabled drivers build config 00:01:32.855 bus/dpaa: not in enabled drivers build config 00:01:32.855 bus/fslmc: not in enabled drivers build config 00:01:32.855 bus/ifpga: not in enabled drivers build config 00:01:32.855 bus/vmbus: not in enabled drivers build config 00:01:32.855 common/cnxk: not in enabled drivers build config 00:01:32.855 common/mlx5: not in enabled drivers build config 00:01:32.855 common/qat: not in enabled drivers build config 00:01:32.855 common/sfc_efx: not in enabled drivers build config 00:01:32.855 mempool/bucket: not in enabled drivers build config 00:01:32.855 mempool/cnxk: not in enabled drivers build config 00:01:32.855 mempool/dpaa: not in enabled drivers build config 00:01:32.855 mempool/dpaa2: not in enabled drivers build config 00:01:32.855 mempool/octeontx: not in enabled drivers build config 00:01:32.855 mempool/stack: not in enabled drivers build config 00:01:32.855 dma/cnxk: not in enabled drivers build config 00:01:32.855 dma/dpaa: not in enabled drivers build config 00:01:32.855 dma/dpaa2: not in enabled drivers build config 00:01:32.855 dma/hisilicon: not in enabled drivers build config 00:01:32.855 dma/idxd: not in enabled drivers build config 00:01:32.855 dma/ioat: not in enabled drivers build config 00:01:32.855 dma/skeleton: not in enabled drivers build config 00:01:32.855 net/af_packet: not in enabled drivers build config 00:01:32.855 net/af_xdp: not in enabled drivers build config 00:01:32.855 net/ark: not in enabled drivers build config 00:01:32.855 net/atlantic: not in enabled drivers build config 00:01:32.855 net/avp: not in enabled drivers build config 00:01:32.855 net/axgbe: not in enabled drivers build config 00:01:32.855 net/bnx2x: not in enabled drivers build config 00:01:32.855 net/bnxt: not in enabled drivers build config 00:01:32.855 net/bonding: not in enabled drivers build config 00:01:32.855 net/cnxk: not in enabled drivers build config 
00:01:32.855 net/cxgbe: not in enabled drivers build config 00:01:32.855 net/dpaa: not in enabled drivers build config 00:01:32.855 net/dpaa2: not in enabled drivers build config 00:01:32.855 net/e1000: not in enabled drivers build config 00:01:32.855 net/ena: not in enabled drivers build config 00:01:32.855 net/enetc: not in enabled drivers build config 00:01:32.855 net/enetfec: not in enabled drivers build config 00:01:32.855 net/enic: not in enabled drivers build config 00:01:32.855 net/failsafe: not in enabled drivers build config 00:01:32.855 net/fm10k: not in enabled drivers build config 00:01:32.855 net/gve: not in enabled drivers build config 00:01:32.855 net/hinic: not in enabled drivers build config 00:01:32.855 net/hns3: not in enabled drivers build config 00:01:32.855 net/iavf: not in enabled drivers build config 00:01:32.855 net/ice: not in enabled drivers build config 00:01:32.855 net/idpf: not in enabled drivers build config 00:01:32.855 net/igc: not in enabled drivers build config 00:01:32.855 net/ionic: not in enabled drivers build config 00:01:32.855 net/ipn3ke: not in enabled drivers build config 00:01:32.855 net/ixgbe: not in enabled drivers build config 00:01:32.855 net/kni: not in enabled drivers build config 00:01:32.855 net/liquidio: not in enabled drivers build config 00:01:32.855 net/mana: not in enabled drivers build config 00:01:32.855 net/memif: not in enabled drivers build config 00:01:32.855 net/mlx4: not in enabled drivers build config 00:01:32.855 net/mlx5: not in enabled drivers build config 00:01:32.855 net/mvneta: not in enabled drivers build config 00:01:32.855 net/mvpp2: not in enabled drivers build config 00:01:32.855 net/netvsc: not in enabled drivers build config 00:01:32.855 net/nfb: not in enabled drivers build config 00:01:32.855 net/nfp: not in enabled drivers build config 00:01:32.855 net/ngbe: not in enabled drivers build config 00:01:32.855 net/null: not in enabled drivers build config 00:01:32.855 net/octeontx: not in enabled drivers build config 00:01:32.855 net/octeon_ep: not in enabled drivers build config 00:01:32.855 net/pcap: not in enabled drivers build config 00:01:32.855 net/pfe: not in enabled drivers build config 00:01:32.855 net/qede: not in enabled drivers build config 00:01:32.855 net/ring: not in enabled drivers build config 00:01:32.855 net/sfc: not in enabled drivers build config 00:01:32.855 net/softnic: not in enabled drivers build config 00:01:32.855 net/tap: not in enabled drivers build config 00:01:32.855 net/thunderx: not in enabled drivers build config 00:01:32.855 net/txgbe: not in enabled drivers build config 00:01:32.855 net/vdev_netvsc: not in enabled drivers build config 00:01:32.855 net/vhost: not in enabled drivers build config 00:01:32.855 net/virtio: not in enabled drivers build config 00:01:32.855 net/vmxnet3: not in enabled drivers build config 00:01:32.855 raw/cnxk_bphy: not in enabled drivers build config 00:01:32.855 raw/cnxk_gpio: not in enabled drivers build config 00:01:32.855 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:32.855 raw/ifpga: not in enabled drivers build config 00:01:32.855 raw/ntb: not in enabled drivers build config 00:01:32.855 raw/skeleton: not in enabled drivers build config 00:01:32.855 crypto/armv8: not in enabled drivers build config 00:01:32.855 crypto/bcmfs: not in enabled drivers build config 00:01:32.855 crypto/caam_jr: not in enabled drivers build config 00:01:32.855 crypto/ccp: not in enabled drivers build config 00:01:32.855 crypto/cnxk: not in enabled drivers 
build config 00:01:32.855 crypto/dpaa_sec: not in enabled drivers build config 00:01:32.855 crypto/dpaa2_sec: not in enabled drivers build config 00:01:32.855 crypto/ipsec_mb: not in enabled drivers build config 00:01:32.855 crypto/mlx5: not in enabled drivers build config 00:01:32.855 crypto/mvsam: not in enabled drivers build config 00:01:32.855 crypto/nitrox: not in enabled drivers build config 00:01:32.855 crypto/null: not in enabled drivers build config 00:01:32.855 crypto/octeontx: not in enabled drivers build config 00:01:32.855 crypto/openssl: not in enabled drivers build config 00:01:32.855 crypto/scheduler: not in enabled drivers build config 00:01:32.855 crypto/uadk: not in enabled drivers build config 00:01:32.855 crypto/virtio: not in enabled drivers build config 00:01:32.855 compress/isal: not in enabled drivers build config 00:01:32.855 compress/mlx5: not in enabled drivers build config 00:01:32.855 compress/octeontx: not in enabled drivers build config 00:01:32.855 compress/zlib: not in enabled drivers build config 00:01:32.855 regex/mlx5: not in enabled drivers build config 00:01:32.855 regex/cn9k: not in enabled drivers build config 00:01:32.855 vdpa/ifc: not in enabled drivers build config 00:01:32.855 vdpa/mlx5: not in enabled drivers build config 00:01:32.855 vdpa/sfc: not in enabled drivers build config 00:01:32.855 event/cnxk: not in enabled drivers build config 00:01:32.855 event/dlb2: not in enabled drivers build config 00:01:32.855 event/dpaa: not in enabled drivers build config 00:01:32.855 event/dpaa2: not in enabled drivers build config 00:01:32.855 event/dsw: not in enabled drivers build config 00:01:32.855 event/opdl: not in enabled drivers build config 00:01:32.855 event/skeleton: not in enabled drivers build config 00:01:32.855 event/sw: not in enabled drivers build config 00:01:32.855 event/octeontx: not in enabled drivers build config 00:01:32.855 baseband/acc: not in enabled drivers build config 00:01:32.855 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:32.855 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:32.855 baseband/la12xx: not in enabled drivers build config 00:01:32.855 baseband/null: not in enabled drivers build config 00:01:32.855 baseband/turbo_sw: not in enabled drivers build config 00:01:32.855 gpu/cuda: not in enabled drivers build config 00:01:32.855 00:01:32.855 00:01:32.855 Build targets in project: 311 00:01:32.855 00:01:32.855 DPDK 22.11.4 00:01:32.855 00:01:32.855 User defined options 00:01:32.855 libdir : lib 00:01:32.855 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:32.855 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:32.855 c_link_args : 00:01:32.855 enable_docs : false 00:01:32.855 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:32.855 enable_kmods : false 00:01:32.855 machine : native 00:01:32.855 tests : false 00:01:32.855 00:01:32.855 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:32.855 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
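Editor's note: the Meson output above all comes from the single configure command at the top of this stage. A condensed sketch of that configure-and-build pair, with a placeholder source path and a portable job count (the job itself uses the Jenkins workspace and -j112); the log also invokes meson without the "setup" subcommand and passes the deprecated -Dmachine=native, which is what triggers the two warnings above.

#!/usr/bin/env bash
# Condensed sketch of the DPDK configure-and-build step shown in this log.
DPDK_DIR=/path/to/dpdk    # placeholder; the trace uses the workspace copy
DRIVERS="bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base"

cd "$DPDK_DIR"
# Out-of-tree build dir; only the listed drivers are enabled, kernel
# modules/docs/tests are off, and c_args mirror the dpdk_cflags built
# up earlier in the trace.
meson setup build-tmp \
    --prefix="$DPDK_DIR/build" --libdir lib \
    -Denable_docs=false -Denable_kmods=false -Dtests=false \
    -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
    -Denable_drivers="$DRIVERS"

ninja -C build-tmp -j"$(nproc)"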
00:01:32.855 05:26:43 -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:32.855 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:32.855 [1/740] Generating lib/rte_kvargs_mingw with a custom command 00:01:32.856 [2/740] Generating lib/rte_telemetry_def with a custom command 00:01:32.856 [3/740] Generating lib/rte_telemetry_mingw with a custom command 00:01:32.856 [4/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:32.856 [5/740] Generating lib/rte_kvargs_def with a custom command 00:01:32.856 [6/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:32.856 [7/740] Generating lib/rte_ring_mingw with a custom command 00:01:32.856 [8/740] Generating lib/rte_eal_mingw with a custom command 00:01:32.856 [9/740] Generating lib/rte_ring_def with a custom command 00:01:32.856 [10/740] Generating lib/rte_rcu_mingw with a custom command 00:01:32.856 [11/740] Generating lib/rte_mempool_def with a custom command 00:01:32.856 [12/740] Generating lib/rte_mempool_mingw with a custom command 00:01:33.120 [13/740] Generating lib/rte_mbuf_def with a custom command 00:01:33.120 [14/740] Generating lib/rte_eal_def with a custom command 00:01:33.120 [15/740] Generating lib/rte_rcu_def with a custom command 00:01:33.120 [16/740] Generating lib/rte_mbuf_mingw with a custom command 00:01:33.120 [17/740] Generating lib/rte_meter_mingw with a custom command 00:01:33.120 [18/740] Generating lib/rte_net_def with a custom command 00:01:33.120 [19/740] Generating lib/rte_net_mingw with a custom command 00:01:33.120 [20/740] Generating lib/rte_meter_def with a custom command 00:01:33.120 [21/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:33.120 [22/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:33.120 [23/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:33.120 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:33.120 [25/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:33.120 [26/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:01:33.120 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:33.120 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:33.120 [29/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:33.120 [30/740] Generating lib/rte_pci_mingw with a custom command 00:01:33.120 [31/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:33.120 [32/740] Generating lib/rte_ethdev_mingw with a custom command 00:01:33.120 [33/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:33.120 [34/740] Generating lib/rte_ethdev_def with a custom command 00:01:33.120 [35/740] Generating lib/rte_pci_def with a custom command 00:01:33.120 [36/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:33.120 [37/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:33.121 [38/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:33.121 [39/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:33.121 [40/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:33.121 [41/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 
00:01:33.121 [42/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:33.121 [43/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:33.121 [44/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:33.121 [45/740] Generating lib/rte_metrics_def with a custom command 00:01:33.121 [46/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:33.121 [47/740] Generating lib/rte_cmdline_def with a custom command 00:01:33.121 [48/740] Generating lib/rte_cmdline_mingw with a custom command 00:01:33.121 [49/740] Linking static target lib/librte_kvargs.a 00:01:33.121 [50/740] Generating lib/rte_metrics_mingw with a custom command 00:01:33.121 [51/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:33.121 [52/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:33.121 [53/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:33.121 [54/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:33.121 [55/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:33.121 [56/740] Generating lib/rte_hash_def with a custom command 00:01:33.121 [57/740] Generating lib/rte_hash_mingw with a custom command 00:01:33.121 [58/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:33.121 [59/740] Generating lib/rte_timer_mingw with a custom command 00:01:33.121 [60/740] Generating lib/rte_timer_def with a custom command 00:01:33.121 [61/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:33.121 [62/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:33.121 [63/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:33.121 [64/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:33.121 [65/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:33.121 [66/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:33.121 [67/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:33.121 [68/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:33.121 [69/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:33.121 [70/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:33.121 [71/740] Generating lib/rte_bbdev_def with a custom command 00:01:33.121 [72/740] Generating lib/rte_acl_def with a custom command 00:01:33.121 [73/740] Generating lib/rte_acl_mingw with a custom command 00:01:33.121 [74/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:33.121 [75/740] Generating lib/rte_bbdev_mingw with a custom command 00:01:33.121 [76/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:33.121 [77/740] Generating lib/rte_bitratestats_def with a custom command 00:01:33.121 [78/740] Generating lib/rte_bitratestats_mingw with a custom command 00:01:33.121 [79/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:33.121 [80/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:33.121 [81/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:33.121 [82/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:33.121 [83/740] Linking static target lib/librte_pci.a 00:01:33.121 [84/740] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:33.121 [85/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:33.121 [86/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:33.121 [87/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:33.121 [88/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:33.121 [89/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:33.121 [90/740] Linking static target lib/librte_meter.a 00:01:33.121 [91/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:33.383 [92/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:33.383 [93/740] Generating lib/rte_cfgfile_mingw with a custom command 00:01:33.383 [94/740] Generating lib/rte_cfgfile_def with a custom command 00:01:33.383 [95/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:33.383 [96/740] Generating lib/rte_bpf_def with a custom command 00:01:33.383 [97/740] Generating lib/rte_bpf_mingw with a custom command 00:01:33.383 [98/740] Generating lib/rte_compressdev_def with a custom command 00:01:33.383 [99/740] Generating lib/rte_compressdev_mingw with a custom command 00:01:33.383 [100/740] Linking static target lib/librte_ring.a 00:01:33.383 [101/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:33.383 [102/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:33.383 [103/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:33.383 [104/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:33.383 [105/740] Generating lib/rte_cryptodev_def with a custom command 00:01:33.383 [106/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:33.383 [107/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:33.383 [108/740] Generating lib/rte_cryptodev_mingw with a custom command 00:01:33.383 [109/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:33.383 [110/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:33.383 [111/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:01:33.383 [112/740] Generating lib/rte_distributor_def with a custom command 00:01:33.383 [113/740] Generating lib/rte_distributor_mingw with a custom command 00:01:33.383 [114/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:33.383 [115/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:33.383 [116/740] Generating lib/rte_efd_mingw with a custom command 00:01:33.383 [117/740] Generating lib/rte_efd_def with a custom command 00:01:33.383 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:33.383 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:33.383 [120/740] Generating lib/rte_eventdev_mingw with a custom command 00:01:33.383 [121/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:33.383 [122/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:33.383 [123/740] Generating lib/rte_eventdev_def with a custom command 00:01:33.383 [124/740] Generating lib/rte_gpudev_mingw with a custom command 00:01:33.383 [125/740] Generating lib/rte_gro_def with a custom command 00:01:33.383 [126/740] Generating 
lib/rte_gpudev_def with a custom command 00:01:33.384 [127/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:33.384 [128/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:33.384 [129/740] Generating lib/rte_gro_mingw with a custom command 00:01:33.384 [130/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:33.384 [131/740] Generating lib/rte_gso_mingw with a custom command 00:01:33.384 [132/740] Generating lib/rte_gso_def with a custom command 00:01:33.384 [133/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:33.647 [134/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:33.647 [135/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.647 [136/740] Generating lib/rte_ip_frag_def with a custom command 00:01:33.647 [137/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:33.647 [138/740] Generating lib/rte_ip_frag_mingw with a custom command 00:01:33.647 [139/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.647 [140/740] Linking target lib/librte_kvargs.so.23.0 00:01:33.647 [141/740] Generating lib/rte_jobstats_mingw with a custom command 00:01:33.647 [142/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:33.647 [143/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:33.647 [144/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.647 [145/740] Generating lib/rte_jobstats_def with a custom command 00:01:33.647 [146/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:33.647 [147/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:33.647 [148/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:33.647 [149/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:33.647 [150/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:33.647 [151/740] Linking static target lib/librte_cfgfile.a 00:01:33.647 [152/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:33.647 [153/740] Generating lib/rte_latencystats_mingw with a custom command 00:01:33.647 [154/740] Generating lib/rte_lpm_def with a custom command 00:01:33.647 [155/740] Generating lib/rte_latencystats_def with a custom command 00:01:33.647 [156/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:33.647 [157/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:33.647 [158/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:33.647 [159/740] Generating lib/rte_lpm_mingw with a custom command 00:01:33.647 [160/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:33.647 [161/740] Generating lib/rte_member_def with a custom command 00:01:33.647 [162/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.647 [163/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:33.647 [164/740] Generating lib/rte_member_mingw with a custom command 00:01:33.647 [165/740] Generating lib/rte_pcapng_def with a custom command 00:01:33.647 [166/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:33.647 [167/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 
00:01:33.647 [168/740] Generating lib/rte_pcapng_mingw with a custom command 00:01:33.647 [169/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:33.647 [170/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:33.647 [171/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:33.647 [172/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:33.647 [173/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:33.647 [174/740] Linking static target lib/librte_jobstats.a 00:01:33.647 [175/740] Linking static target lib/librte_cmdline.a 00:01:33.910 [176/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:33.910 [177/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:33.910 [178/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:33.910 [179/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:33.910 [180/740] Generating lib/rte_power_def with a custom command 00:01:33.910 [181/740] Generating lib/rte_power_mingw with a custom command 00:01:33.910 [182/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:33.910 [183/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:33.910 [184/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:33.910 [185/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:33.910 [186/740] Linking static target lib/librte_timer.a 00:01:33.910 [187/740] Linking static target lib/librte_telemetry.a 00:01:33.910 [188/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:33.910 [189/740] Linking static target lib/librte_metrics.a 00:01:33.910 [190/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:33.910 [191/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:01:33.910 [192/740] Generating lib/rte_rawdev_def with a custom command 00:01:33.910 [193/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:33.910 [194/740] Generating lib/rte_rawdev_mingw with a custom command 00:01:33.910 [195/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:33.910 [196/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:33.910 [197/740] Generating lib/rte_regexdev_def with a custom command 00:01:33.910 [198/740] Generating lib/rte_dmadev_def with a custom command 00:01:33.910 [199/740] Generating lib/rte_regexdev_mingw with a custom command 00:01:33.910 [200/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:33.910 [201/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:33.910 [202/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:33.910 [203/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:33.910 [204/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:33.910 [205/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:33.910 [206/740] Generating lib/rte_dmadev_mingw with a custom command 00:01:33.910 [207/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:33.910 [208/740] Generating lib/rte_rib_def with a custom command 00:01:33.910 [209/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:33.910 [210/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:33.910 [211/740] 
Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:33.910 [212/740] Generating lib/rte_rib_mingw with a custom command 00:01:33.910 [213/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:33.910 [214/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:33.910 [215/740] Generating lib/rte_reorder_def with a custom command 00:01:33.910 [216/740] Generating lib/rte_reorder_mingw with a custom command 00:01:33.910 [217/740] Generating lib/rte_sched_def with a custom command 00:01:33.910 [218/740] Generating lib/rte_sched_mingw with a custom command 00:01:33.910 [219/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:33.910 [220/740] Generating lib/rte_security_def with a custom command 00:01:33.910 [221/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:33.910 [222/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:33.910 [223/740] Generating lib/rte_security_mingw with a custom command 00:01:33.910 [224/740] Linking static target lib/librte_net.a 00:01:33.910 [225/740] Linking static target lib/librte_bitratestats.a 00:01:33.910 [226/740] Generating lib/rte_stack_def with a custom command 00:01:33.910 [227/740] Generating lib/rte_stack_mingw with a custom command 00:01:33.910 [228/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:33.910 [229/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:33.910 [230/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:33.910 [231/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:33.910 [232/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:33.910 [233/740] Generating lib/rte_vhost_mingw with a custom command 00:01:33.910 [234/740] Generating lib/rte_vhost_def with a custom command 00:01:33.910 [235/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:33.910 [236/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:33.910 [237/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:33.910 [238/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:33.910 [239/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:33.910 [240/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:33.910 [241/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:33.910 [242/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:33.910 [243/740] Generating lib/rte_ipsec_mingw with a custom command 00:01:33.910 [244/740] Generating lib/rte_ipsec_def with a custom command 00:01:33.910 [245/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:34.168 [246/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:34.168 [247/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:34.168 [248/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:34.168 [249/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:34.168 [250/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:01:34.168 [251/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:34.168 [252/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:34.168 [253/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:34.168 
[254/740] Generating lib/rte_fib_def with a custom command 00:01:34.168 [255/740] Generating lib/rte_fib_mingw with a custom command 00:01:34.168 [256/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:34.168 [257/740] Linking static target lib/librte_stack.a 00:01:34.168 [258/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:34.168 [259/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:34.168 [260/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:34.168 [261/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:34.168 [262/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:34.168 [263/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:34.168 [264/740] Generating lib/rte_port_def with a custom command 00:01:34.168 [265/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:34.168 [266/740] Generating lib/rte_port_mingw with a custom command 00:01:34.168 [267/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:34.168 [268/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:34.168 [269/740] Generating lib/rte_pdump_def with a custom command 00:01:34.168 [270/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:34.168 [271/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:34.168 [272/740] Generating lib/rte_pdump_mingw with a custom command 00:01:34.168 [273/740] Linking static target lib/librte_compressdev.a 00:01:34.168 [274/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:34.168 [275/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.168 [276/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:34.168 [277/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:34.168 [278/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.169 [279/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:34.169 [280/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:34.169 [281/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:34.169 [282/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:34.169 [283/740] Linking static target lib/librte_rcu.a 00:01:34.169 [284/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:34.169 [285/740] Linking static target lib/librte_rawdev.a 00:01:34.169 [286/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:34.169 [287/740] Linking static target lib/librte_mempool.a 00:01:34.431 [288/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.431 [289/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:34.431 [290/740] Generating lib/rte_table_def with a custom command 00:01:34.431 [291/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:34.431 [292/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.431 [293/740] Linking static target lib/librte_bbdev.a 00:01:34.431 [294/740] Generating lib/rte_table_mingw with a custom command 00:01:34.431 [295/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:34.431 [296/740] 
Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:34.431 [297/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:34.431 [298/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:34.431 [299/740] Linking static target lib/librte_gro.a 00:01:34.431 [300/740] Linking static target lib/librte_dmadev.a 00:01:34.431 [301/740] Linking static target lib/librte_gpudev.a 00:01:34.431 [302/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.431 [303/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:34.431 [304/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.431 [305/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:34.431 [306/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:34.431 [307/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:34.431 [308/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:34.431 [309/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.431 [310/740] Generating lib/rte_pipeline_mingw with a custom command 00:01:34.431 [311/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.431 [312/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:34.431 [313/740] Generating lib/rte_pipeline_def with a custom command 00:01:34.431 [314/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:34.431 [315/740] Linking static target lib/librte_gso.a 00:01:34.431 [316/740] Linking target lib/librte_telemetry.so.23.0 00:01:34.431 [317/740] Linking static target lib/librte_latencystats.a 00:01:34.431 [318/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:34.431 [319/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:34.431 [320/740] Linking static target lib/librte_distributor.a 00:01:34.431 [321/740] Generating lib/rte_graph_mingw with a custom command 00:01:34.431 [322/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:01:34.431 [323/740] Generating lib/rte_graph_def with a custom command 00:01:34.431 [324/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:34.431 [325/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:34.431 [326/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:34.431 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:34.431 [328/740] Linking static target lib/librte_ip_frag.a 00:01:34.696 [329/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:34.696 [330/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:34.696 [331/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:34.696 [332/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:34.696 [333/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:34.696 [334/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:34.696 [335/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:01:34.696 [336/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:34.696 [337/740] Linking static target 
lib/librte_regexdev.a 00:01:34.696 [338/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:34.696 [339/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:34.696 [340/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:34.696 [341/740] Generating lib/rte_node_def with a custom command 00:01:34.696 [342/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:34.696 [343/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:34.696 [344/740] Generating lib/rte_node_mingw with a custom command 00:01:34.696 [345/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:34.696 [346/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.696 [347/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.697 [348/740] Generating drivers/rte_bus_pci_def with a custom command 00:01:34.697 [349/740] Linking static target lib/librte_eal.a 00:01:34.697 [350/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:34.697 [351/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:34.697 [352/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:01:34.697 [353/740] Generating drivers/rte_bus_vdev_def with a custom command 00:01:34.697 [354/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:34.697 [355/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:34.697 [356/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.697 [357/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:34.697 [358/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.697 [359/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:34.697 [360/740] Linking static target lib/librte_power.a 00:01:34.697 [361/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:34.697 [362/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:01:34.697 [363/740] Linking static target lib/librte_reorder.a 00:01:34.697 [364/740] Generating drivers/rte_mempool_ring_def with a custom command 00:01:34.697 [365/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:34.697 [366/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:34.697 [367/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:34.697 [368/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:34.697 [369/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:01:34.964 [370/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:34.964 [371/740] Linking static target lib/librte_security.a 00:01:34.964 [372/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:34.964 [373/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.964 [374/740] Linking static target lib/librte_pcapng.a 00:01:34.964 [375/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:34.964 [376/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:34.964 [377/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:34.964 [378/740] Compiling C object 
lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:34.964 [379/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:34.964 [380/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:34.964 [381/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:34.964 [382/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:34.964 [383/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:34.964 [384/740] Linking static target lib/librte_mbuf.a 00:01:34.964 [385/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:34.964 [386/740] Linking static target lib/librte_bpf.a 00:01:34.964 [387/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.964 [388/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:34.964 [389/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.964 [390/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:34.964 [391/740] Generating drivers/rte_net_i40e_def with a custom command 00:01:34.964 [392/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:01:34.964 [393/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:34.964 [394/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:34.964 [395/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:34.964 [396/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:34.964 [397/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:35.227 [398/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:35.227 [399/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:35.227 [400/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:35.227 [401/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:35.227 [402/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:35.227 [403/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:35.227 [404/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:35.227 [405/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:35.227 [406/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:35.227 [407/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:35.227 [408/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:35.227 [409/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:35.227 [410/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.227 [411/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:35.227 [412/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:35.227 [413/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.227 [414/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:35.227 [415/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.227 [416/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:35.227 [417/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 
00:01:35.227 [418/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:35.227 [419/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:35.227 [420/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:35.227 [421/740] Linking static target lib/librte_lpm.a 00:01:35.227 [422/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:35.227 [423/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:35.227 [424/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:35.227 [425/740] Linking static target lib/librte_rib.a 00:01:35.227 [426/740] Linking static target lib/librte_graph.a 00:01:35.227 [427/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:35.227 [428/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.227 [429/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:35.227 [430/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:35.227 [431/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:35.227 [432/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:35.227 [433/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:35.227 [434/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:35.227 [435/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:35.227 [436/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:35.490 [437/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:35.490 [438/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:35.490 [439/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:35.490 [440/740] Linking static target lib/librte_efd.a 00:01:35.490 [441/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.490 [442/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:35.490 [443/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:35.490 [444/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:35.490 [445/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:35.490 [446/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:35.490 [447/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:35.490 [448/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:35.490 [449/740] Linking static target drivers/librte_bus_vdev.a 00:01:35.490 [450/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:35.490 [451/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:35.490 [452/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.490 [453/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:35.490 [454/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.490 [455/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:35.490 [456/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:35.490 [457/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:35.490 [458/740] Generating lib/compressdev.sym_chk with a 
custom command (wrapped by meson to capture output) 00:01:35.490 [459/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.490 [460/740] Linking static target lib/librte_fib.a 00:01:35.756 [461/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:35.756 [462/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:35.756 [463/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:35.756 [464/740] Linking static target lib/librte_pdump.a 00:01:35.756 [465/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:35.756 [466/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.756 [467/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.756 [468/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:35.756 [469/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.756 [470/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:35.756 [471/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.756 [472/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:35.756 [473/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:35.756 [474/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:35.756 [475/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.756 [476/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:35.756 [477/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:36.015 [478/740] Linking static target drivers/librte_bus_pci.a 00:01:36.015 [479/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.015 [480/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:36.015 [481/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:36.015 [482/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:36.015 [483/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:36.015 [484/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:36.015 [485/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:36.015 [486/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:36.015 [487/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.015 [488/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:36.015 [489/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:36.015 [490/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:36.015 [491/740] Linking static target lib/librte_table.a 00:01:36.015 [492/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:36.015 [493/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:36.015 [494/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:36.015 [495/740] Compiling C object 
app/dpdk-test-acl.p/test-acl_main.c.o 00:01:36.015 [496/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:36.015 [497/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:36.274 [498/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:36.274 [499/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:36.274 [500/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:36.274 [501/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.274 [502/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:36.274 [503/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:36.274 [504/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:36.274 [505/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:36.274 [506/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:36.275 [507/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.275 [508/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:36.275 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:36.275 [510/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:36.275 [511/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:36.275 [512/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.275 [513/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:36.275 [514/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:36.275 [515/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:36.275 [516/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:36.275 [517/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:36.275 [518/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:36.275 [519/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:36.275 [520/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:36.275 [521/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:36.533 [522/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:36.533 [523/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:36.533 [524/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:36.533 [525/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.533 [526/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:36.533 [527/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:36.533 [528/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:36.533 [529/740] Linking static target lib/librte_cryptodev.a 00:01:36.533 [530/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:36.533 [531/740] Compiling C object 
app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:36.533 [532/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:36.533 [533/740] Linking static target lib/librte_sched.a 00:01:36.533 [534/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:36.533 [535/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:36.533 [536/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:36.533 [537/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:36.533 [538/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:36.533 [539/740] Linking static target lib/librte_node.a 00:01:36.533 [540/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:36.533 [541/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:36.533 [542/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.533 [543/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:36.533 [544/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:36.533 [545/740] Linking static target lib/librte_ipsec.a 00:01:36.533 [546/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:36.533 [547/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:36.533 [548/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:36.533 [549/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:36.533 [550/740] Linking static target lib/librte_ethdev.a 00:01:36.533 [551/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:36.533 [552/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:36.533 [553/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:36.533 [554/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:36.792 [555/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:36.792 [556/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:36.792 [557/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:36.792 [558/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:36.792 [559/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:36.792 [560/740] Linking static target drivers/librte_mempool_ring.a 00:01:36.792 [561/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:36.792 [562/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:36.792 [563/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:36.792 [564/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:36.792 [565/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:36.792 [566/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:36.792 [567/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:36.792 [568/740] Linking static target lib/librte_member.a 00:01:36.792 [569/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:36.792 [570/740] Compiling C object 
app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:36.792 [571/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:36.792 [572/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.792 [573/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:36.792 [574/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:36.792 [575/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:36.792 [576/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:36.792 [577/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:36.792 [578/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:36.792 [579/740] Linking static target lib/librte_port.a 00:01:36.792 [580/740] Linking static target lib/librte_eventdev.a 00:01:36.792 [581/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:36.792 [582/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:36.792 [583/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:36.792 [584/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:36.792 [585/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:36.792 [586/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.792 [587/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:36.792 [588/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:37.050 [589/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:37.050 [590/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:37.050 [591/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:37.050 [592/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:01:37.050 [593/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:37.050 [594/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:37.050 [595/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.050 [596/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:37.050 [597/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.050 [598/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:37.050 [599/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:37.050 [600/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:37.308 [601/740] Linking static target lib/librte_hash.a 00:01:37.308 [602/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.308 [603/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:37.308 [604/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:37.308 [605/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:37.308 [606/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:01:37.308 [607/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:37.308 [608/740] 
Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:37.308 [609/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:37.566 [610/740] Linking static target lib/librte_acl.a 00:01:37.566 [611/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:37.566 [612/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:37.825 [613/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.825 [614/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:37.825 [615/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.825 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:37.825 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:38.394 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.394 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:38.394 [620/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:38.654 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:39.592 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:39.592 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:39.851 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:39.851 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:39.851 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:39.851 [627/740] Linking static target drivers/librte_net_i40e.a 00:01:39.851 [628/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:40.111 [629/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.111 [630/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:40.111 [631/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.370 [632/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:40.938 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.215 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.215 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:46.215 [636/740] Linking static target lib/librte_vhost.a 00:01:47.154 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:01:47.154 [638/740] Linking static target lib/librte_pipeline.a 00:01:47.723 [639/740] Linking target app/dpdk-test-acl 00:01:47.723 [640/740] Linking target app/dpdk-dumpcap 00:01:47.723 [641/740] Linking target app/dpdk-test-bbdev 00:01:47.723 [642/740] Linking target app/dpdk-test-gpudev 00:01:47.723 [643/740] Linking target app/dpdk-test-security-perf 00:01:47.723 [644/740] Linking target app/dpdk-pdump 00:01:47.723 [645/740] Linking target app/dpdk-proc-info 00:01:47.723 [646/740] Linking target app/dpdk-test-sad 00:01:47.723 [647/740] Linking target app/dpdk-test-compress-perf 00:01:47.723 [648/740] Linking target app/dpdk-test-pipeline 00:01:47.723 [649/740] Linking target app/dpdk-test-regex 00:01:47.723 [650/740] Linking target app/dpdk-test-cmdline 00:01:47.723 
[651/740] Linking target app/dpdk-test-flow-perf 00:01:47.723 [652/740] Linking target app/dpdk-test-fib 00:01:47.723 [653/740] Linking target app/dpdk-test-crypto-perf 00:01:47.723 [654/740] Linking target app/dpdk-test-eventdev 00:01:47.723 [655/740] Linking target app/dpdk-testpmd 00:01:48.293 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.552 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.813 [658/740] Linking target lib/librte_eal.so.23.0 00:01:48.813 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:01:48.813 [660/740] Linking target drivers/librte_bus_vdev.so.23.0 00:01:48.813 [661/740] Linking target lib/librte_ring.so.23.0 00:01:48.813 [662/740] Linking target lib/librte_rawdev.so.23.0 00:01:48.813 [663/740] Linking target lib/librte_pci.so.23.0 00:01:48.813 [664/740] Linking target lib/librte_timer.so.23.0 00:01:48.813 [665/740] Linking target lib/librte_meter.so.23.0 00:01:48.813 [666/740] Linking target lib/librte_cfgfile.so.23.0 00:01:48.813 [667/740] Linking target lib/librte_dmadev.so.23.0 00:01:48.813 [668/740] Linking target lib/librte_stack.so.23.0 00:01:48.813 [669/740] Linking target lib/librte_jobstats.so.23.0 00:01:48.813 [670/740] Linking target lib/librte_graph.so.23.0 00:01:48.813 [671/740] Linking target lib/librte_acl.so.23.0 00:01:49.073 [672/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:01:49.073 [673/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:01:49.073 [674/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:01:49.073 [675/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:01:49.073 [676/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:01:49.073 [677/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:01:49.073 [678/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:01:49.073 [679/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:01:49.073 [680/740] Linking target lib/librte_mempool.so.23.0 00:01:49.073 [681/740] Linking target lib/librte_rcu.so.23.0 00:01:49.073 [682/740] Linking target drivers/librte_bus_pci.so.23.0 00:01:49.073 [683/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:01:49.073 [684/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:01:49.073 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:01:49.332 [686/740] Linking target drivers/librte_mempool_ring.so.23.0 00:01:49.332 [687/740] Linking target lib/librte_rib.so.23.0 00:01:49.333 [688/740] Linking target lib/librte_mbuf.so.23.0 00:01:49.333 [689/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:01:49.333 [690/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:01:49.333 [691/740] Linking target lib/librte_fib.so.23.0 00:01:49.333 [692/740] Linking target lib/librte_bbdev.so.23.0 00:01:49.333 [693/740] Linking target lib/librte_distributor.so.23.0 00:01:49.333 [694/740] Linking target lib/librte_compressdev.so.23.0 00:01:49.333 [695/740] Linking target lib/librte_net.so.23.0 00:01:49.333 [696/740] Linking target lib/librte_gpudev.so.23.0 00:01:49.333 
[697/740] Linking target lib/librte_regexdev.so.23.0 00:01:49.333 [698/740] Linking target lib/librte_reorder.so.23.0 00:01:49.333 [699/740] Linking target lib/librte_cryptodev.so.23.0 00:01:49.333 [700/740] Linking target lib/librte_sched.so.23.0 00:01:49.592 [701/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:01:49.592 [702/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:01:49.592 [703/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:01:49.592 [704/740] Linking target lib/librte_cmdline.so.23.0 00:01:49.592 [705/740] Linking target lib/librte_hash.so.23.0 00:01:49.592 [706/740] Linking target lib/librte_security.so.23.0 00:01:49.592 [707/740] Linking target lib/librte_ethdev.so.23.0 00:01:49.851 [708/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:01:49.851 [709/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:01:49.851 [710/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:01:49.851 [711/740] Linking target lib/librte_lpm.so.23.0 00:01:49.851 [712/740] Linking target lib/librte_ip_frag.so.23.0 00:01:49.851 [713/740] Linking target lib/librte_efd.so.23.0 00:01:49.851 [714/740] Linking target lib/librte_member.so.23.0 00:01:49.851 [715/740] Linking target lib/librte_metrics.so.23.0 00:01:49.851 [716/740] Linking target lib/librte_bpf.so.23.0 00:01:49.851 [717/740] Linking target lib/librte_pcapng.so.23.0 00:01:49.852 [718/740] Linking target lib/librte_ipsec.so.23.0 00:01:49.852 [719/740] Linking target lib/librte_gso.so.23.0 00:01:49.852 [720/740] Linking target lib/librte_gro.so.23.0 00:01:49.852 [721/740] Linking target lib/librte_power.so.23.0 00:01:49.852 [722/740] Linking target lib/librte_vhost.so.23.0 00:01:49.852 [723/740] Linking target lib/librte_eventdev.so.23.0 00:01:49.852 [724/740] Linking target drivers/librte_net_i40e.so.23.0 00:01:49.852 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:01:49.852 [726/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:01:49.852 [727/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:01:49.852 [728/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:01:50.111 [729/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:01:50.111 [730/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:01:50.111 [731/740] Linking target lib/librte_node.so.23.0 00:01:50.111 [732/740] Linking target lib/librte_pdump.so.23.0 00:01:50.111 [733/740] Linking target lib/librte_bitratestats.so.23.0 00:01:50.111 [734/740] Linking target lib/librte_latencystats.so.23.0 00:01:50.111 [735/740] Linking target lib/librte_port.so.23.0 00:01:50.111 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:01:50.370 [737/740] Linking target lib/librte_table.so.23.0 00:01:50.370 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:01:52.275 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.275 [740/740] Linking target lib/librte_pipeline.so.23.0 00:01:52.533 05:27:03 -- common/autobuild_common.sh@190 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 
00:01:52.533 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:52.533 [0/1] Installing files. 00:01:52.794 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:52.794 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:01:52.794 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:52.795 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.795 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.795 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 
00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:52.796 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to 
00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.796 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto
00:01:52.797 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto
00:01:52.797 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:52.797 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:01:53.059 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:01:53.059 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:01:53.059 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:53.059 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0
00:01:53.059 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.059 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.060 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
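The entries above and below stage DPDK's public headers into the private prefix dpdk/build, which the SPDK configure step later in this log consumes via --with-dpdk=.../dpdk/build. A minimal sketch of building against that staged tree, assuming the workspace paths shown in these entries and a hypothetical test file dpdk_check.c that is not part of this build:

# Point pkg-config at the staged prefix; libdpdk.pc is installed a few entries below.
DPDK_BUILD=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
export PKG_CONFIG_PATH="$DPDK_BUILD/lib/pkgconfig"
# Compile and link a trivial consumer against the staged headers and libraries.
printf '#include <rte_eal.h>\nint main(void){ return 0; }\n' > dpdk_check.c
cc dpdk_check.c $(pkg-config --cflags --libs libdpdk) -o dpdk_check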
00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:53.061 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:53.061 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:01:53.061 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:01:53.061 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:01:53.061 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:01:53.061 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:01:53.061 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:01:53.061 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:01:53.061 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:01:53.061 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:01:53.061 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:01:53.061 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:01:53.061 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:01:53.061 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:01:53.061 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:01:53.061 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:01:53.061 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:01:53.061 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:01:53.061 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:01:53.061 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:01:53.062 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:01:53.062 Installing symlink pointing to librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:01:53.062 Installing symlink 
pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:01:53.062 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:01:53.062 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:01:53.062 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:01:53.062 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:01:53.062 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:01:53.062 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:01:53.062 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:01:53.062 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:01:53.062 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:01:53.062 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:01:53.062 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:01:53.062 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:01:53.062 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:01:53.062 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:01:53.062 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:01:53.062 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:01:53.062 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:01:53.062 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:01:53.062 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:01:53.062 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:01:53.062 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:01:53.062 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:01:53.062 Installing symlink pointing to librte_distributor.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:01:53.062 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:01:53.062 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:01:53.062 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:01:53.062 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:01:53.062 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:01:53.062 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:01:53.062 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:01:53.062 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:01:53.062 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:01:53.062 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:01:53.062 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:01:53.062 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:01:53.062 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:01:53.062 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:01:53.062 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:01:53.062 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:01:53.062 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:01:53.062 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:01:53.062 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:01:53.062 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:01:53.062 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:01:53.062 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:01:53.062 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:01:53.062 Installing symlink pointing to 
librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:01:53.062 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:01:53.062 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:01:53.062 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:01:53.062 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:01:53.062 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:01:53.062 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:01:53.062 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:01:53.062 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:01:53.062 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:01:53.062 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:01:53.062 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:01:53.062 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:01:53.062 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:01:53.062 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:01:53.062 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:01:53.062 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:01:53.062 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:01:53.062 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:01:53.062 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:01:53.062 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:01:53.062 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:01:53.062 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:01:53.062 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:01:53.062 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:01:53.062 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:01:53.062 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:01:53.062 './librte_mempool_ring.so.23' -> 
'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:01:53.062 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:01:53.062 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:01:53.062 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:01:53.062 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:01:53.062 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:01:53.062 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:01:53.062 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:01:53.062 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:01:53.062 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:01:53.062 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:01:53.062 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:01:53.062 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:01:53.062 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:01:53.062 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:01:53.062 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:01:53.062 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:01:53.062 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:01:53.062 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:01:53.062 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:01:53.062 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:01:53.062 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:01:53.062 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:01:53.062 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:01:53.062 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:01:53.062 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:01:53.062 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:01:53.062 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:01:53.062 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:01:53.062 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:01:53.062 05:27:04 -- common/autobuild_common.sh@192 -- $ uname -s 00:01:53.062 05:27:04 -- common/autobuild_common.sh@192 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:01:53.062 05:27:04 -- common/autobuild_common.sh@203 -- $ cat 00:01:53.062 05:27:04 -- common/autobuild_common.sh@208 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:53.062 00:01:53.062 real 0m26.285s 00:01:53.062 user 6m35.062s 00:01:53.062 sys 2m13.718s 00:01:53.062 05:27:04 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:53.062 05:27:04 -- common/autotest_common.sh@10 -- $ set +x 00:01:53.062 ************************************ 00:01:53.062 END TEST build_native_dpdk 00:01:53.062 ************************************ 00:01:53.324 05:27:04 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:53.324 05:27:04 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:53.324 05:27:04 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:01:53.324 05:27:04 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:01:53.324 05:27:04 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:01:53.324 05:27:04 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:01:53.324 05:27:04 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:53.324 05:27:04 -- common/autotest_common.sh@10 -- $ set +x 00:01:53.324 ************************************ 00:01:53.324 START TEST autobuild_llvm_precompile 00:01:53.324 ************************************ 00:01:53.324 05:27:04 -- common/autotest_common.sh@1114 -- $ _llvm_precompile 00:01:53.324 05:27:04 -- common/autobuild_common.sh@32 -- $ clang --version 00:01:53.324 05:27:04 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:01:53.324 Target: x86_64-redhat-linux-gnu 00:01:53.324 Thread model: posix 00:01:53.324 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:01:53.324 05:27:04 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:01:53.324 05:27:04 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:01:53.324 05:27:04 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:01:53.324 05:27:04 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:01:53.324 05:27:04 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:01:53.324 05:27:04 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:01:53.324 05:27:04 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:53.324 05:27:04 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:01:53.324 05:27:04 -- 
common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:01:53.324 05:27:04 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:53.324 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:01:53.582 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:53.582 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:53.839 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:54.098 Using 'verbs' RDMA provider 00:02:09.917 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:22.212 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:22.472 Creating mk/config.mk...done. 00:02:22.472 Creating mk/cc.flags.mk...done. 00:02:22.472 Type 'make' to build. 00:02:22.472 00:02:22.472 real 0m29.273s 00:02:22.472 user 0m12.781s 00:02:22.472 sys 0m15.929s 00:02:22.472 05:27:33 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:22.472 05:27:33 -- common/autotest_common.sh@10 -- $ set +x 00:02:22.472 ************************************ 00:02:22.472 END TEST autobuild_llvm_precompile 00:02:22.472 ************************************ 00:02:22.472 05:27:33 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:22.472 05:27:33 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:22.472 05:27:33 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:22.472 05:27:33 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:22.472 05:27:33 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:22.731 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:22.991 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:22.991 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:22.991 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:23.560 Using 'verbs' RDMA provider 00:02:36.342 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:48.560 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:48.560 Creating mk/config.mk...done. 
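The symlink chains installed above follow the standard shared-object versioning pattern: the fully versioned file (for example librte_eal.so.23.0) carries the code, the SONAME link (.so.23) is what the dynamic linker resolves at run time, and the unversioned .so is what -lrte_eal resolves at link time. A minimal sketch of the equivalent links, assuming the library names and staged prefix from this log:

# Recreate one versioned-symlink chain in the staged library directory.
DPDK_BUILD=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
cd "$DPDK_BUILD/lib"
ln -sf librte_eal.so.23.0 librte_eal.so.23   # SONAME link, used at run time
ln -sf librte_eal.so.23   librte_eal.so      # dev link, used at link time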
00:02:48.560 Creating mk/cc.flags.mk...done. 00:02:48.560 Type 'make' to build. 00:02:48.560 05:27:58 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:02:48.560 05:27:58 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:02:48.560 05:27:58 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:02:48.560 05:27:58 -- common/autotest_common.sh@10 -- $ set +x 00:02:48.560 ************************************ 00:02:48.560 START TEST make 00:02:48.560 ************************************ 00:02:48.560 05:27:58 -- common/autotest_common.sh@1114 -- $ make -j112 00:02:48.560 make[1]: Nothing to be done for 'all'. 00:02:49.525 The Meson build system 00:02:49.525 Version: 1.5.0 00:02:49.525 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:02:49.525 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:49.525 Build type: native build 00:02:49.525 Project name: libvfio-user 00:02:49.525 Project version: 0.0.1 00:02:49.525 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:02:49.525 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:02:49.525 Host machine cpu family: x86_64 00:02:49.525 Host machine cpu: x86_64 00:02:49.525 Run-time dependency threads found: YES 00:02:49.525 Library dl found: YES 00:02:49.525 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:49.525 Run-time dependency json-c found: YES 0.17 00:02:49.525 Run-time dependency cmocka found: YES 1.1.7 00:02:49.525 Program pytest-3 found: NO 00:02:49.525 Program flake8 found: NO 00:02:49.525 Program misspell-fixer found: NO 00:02:49.525 Program restructuredtext-lint found: NO 00:02:49.525 Program valgrind found: YES (/usr/bin/valgrind) 00:02:49.525 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:49.525 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:49.525 Compiler for C supports arguments -Wwrite-strings: YES 00:02:49.525 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:02:49.525 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:02:49.525 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:02:49.525 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
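SPDK's build drives this libvfio-user configuration itself, but an equivalent standalone invocation can be sketched from the source and build directories in the meson header above and the user-defined options summarized below (buildtype debug, static default library, libdir /usr/local/lib); this is a sketch, not the exact autobuild command:

# Configure and build libvfio-user with the same options this log reports.
meson setup \
  --buildtype=debug --default-library=static --libdir=/usr/local/lib \
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug \
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug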
00:02:49.525 Build targets in project: 8 00:02:49.525 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:02:49.525 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:02:49.525 00:02:49.525 libvfio-user 0.0.1 00:02:49.525 00:02:49.525 User defined options 00:02:49.525 buildtype : debug 00:02:49.525 default_library: static 00:02:49.525 libdir : /usr/local/lib 00:02:49.525 00:02:49.525 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:49.784 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:50.042 [1/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:02:50.042 [2/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:02:50.042 [3/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:02:50.042 [4/36] Compiling C object samples/lspci.p/lspci.c.o 00:02:50.042 [5/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:02:50.042 [6/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:02:50.042 [7/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:02:50.042 [8/36] Compiling C object samples/null.p/null.c.o 00:02:50.042 [9/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:02:50.042 [10/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:02:50.042 [11/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:02:50.042 [12/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:02:50.042 [13/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:02:50.042 [14/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:02:50.042 [15/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:02:50.042 [16/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:02:50.042 [17/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:02:50.042 [18/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:02:50.042 [19/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:02:50.042 [20/36] Compiling C object test/unit_tests.p/mocks.c.o 00:02:50.042 [21/36] Compiling C object samples/server.p/server.c.o 00:02:50.042 [22/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:02:50.042 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:02:50.042 [24/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:02:50.042 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:02:50.042 [26/36] Compiling C object samples/client.p/client.c.o 00:02:50.042 [27/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:02:50.042 [28/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:02:50.042 [29/36] Linking target samples/client 00:02:50.042 [30/36] Linking static target lib/libvfio-user.a 00:02:50.042 [31/36] Linking target test/unit_tests 00:02:50.042 [32/36] Linking target samples/lspci 00:02:50.042 [33/36] Linking target samples/null 00:02:50.042 [34/36] Linking target samples/server 00:02:50.042 [35/36] Linking target samples/gpio-pci-idio-16 00:02:50.042 [36/36] Linking target samples/shadow_ioeventfd_server 00:02:50.042 INFO: autodetecting backend as ninja 00:02:50.042 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:50.042 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:50.610 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:50.610 ninja: no work to do. 00:02:53.899 CC lib/ut/ut.o 00:02:53.899 CC lib/ut_mock/mock.o 00:02:53.899 CC lib/log/log.o 00:02:53.899 CC lib/log/log_deprecated.o 00:02:53.899 CC lib/log/log_flags.o 00:02:53.899 LIB libspdk_ut.a 00:02:53.899 LIB libspdk_ut_mock.a 00:02:53.899 LIB libspdk_log.a 00:02:53.899 CXX lib/trace_parser/trace.o 00:02:53.899 CC lib/ioat/ioat.o 00:02:53.899 CC lib/dma/dma.o 00:02:53.899 CC lib/util/base64.o 00:02:53.899 CC lib/util/bit_array.o 00:02:53.899 CC lib/util/crc32.o 00:02:53.899 CC lib/util/cpuset.o 00:02:53.899 CC lib/util/crc16.o 00:02:53.899 CC lib/util/crc32_ieee.o 00:02:53.899 CC lib/util/crc32c.o 00:02:53.899 CC lib/util/dif.o 00:02:53.899 CC lib/util/crc64.o 00:02:53.899 CC lib/util/fd.o 00:02:53.899 CC lib/util/file.o 00:02:53.899 CC lib/util/hexlify.o 00:02:53.899 CC lib/util/iov.o 00:02:53.899 CC lib/util/math.o 00:02:53.899 CC lib/util/string.o 00:02:53.899 CC lib/util/pipe.o 00:02:53.899 CC lib/util/strerror_tls.o 00:02:53.899 CC lib/util/uuid.o 00:02:53.899 CC lib/util/fd_group.o 00:02:53.899 CC lib/util/xor.o 00:02:53.899 CC lib/util/zipf.o 00:02:54.158 CC lib/vfio_user/host/vfio_user_pci.o 00:02:54.158 CC lib/vfio_user/host/vfio_user.o 00:02:54.158 LIB libspdk_dma.a 00:02:54.158 LIB libspdk_ioat.a 00:02:54.158 LIB libspdk_vfio_user.a 00:02:54.417 LIB libspdk_util.a 00:02:54.417 LIB libspdk_trace_parser.a 00:02:54.677 CC lib/vmd/led.o 00:02:54.677 CC lib/env_dpdk/env.o 00:02:54.677 CC lib/vmd/vmd.o 00:02:54.677 CC lib/env_dpdk/memory.o 00:02:54.677 CC lib/env_dpdk/threads.o 00:02:54.677 CC lib/env_dpdk/pci.o 00:02:54.677 CC lib/env_dpdk/pci_ioat.o 00:02:54.677 CC lib/env_dpdk/init.o 00:02:54.677 CC lib/env_dpdk/pci_virtio.o 00:02:54.677 CC lib/env_dpdk/pci_vmd.o 00:02:54.677 CC lib/env_dpdk/pci_idxd.o 00:02:54.677 CC lib/env_dpdk/pci_dpdk.o 00:02:54.677 CC lib/env_dpdk/pci_event.o 00:02:54.677 CC lib/env_dpdk/sigbus_handler.o 00:02:54.677 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:54.677 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:54.677 CC lib/json/json_write.o 00:02:54.677 CC lib/json/json_parse.o 00:02:54.677 CC lib/json/json_util.o 00:02:54.677 CC lib/rdma/rdma_verbs.o 00:02:54.677 CC lib/rdma/common.o 00:02:54.677 CC lib/conf/conf.o 00:02:54.677 CC lib/idxd/idxd_kernel.o 00:02:54.677 CC lib/idxd/idxd.o 00:02:54.677 CC lib/idxd/idxd_user.o 00:02:54.677 LIB libspdk_conf.a 00:02:54.936 LIB libspdk_json.a 00:02:54.936 LIB libspdk_rdma.a 00:02:54.936 LIB libspdk_idxd.a 00:02:54.936 LIB libspdk_vmd.a 00:02:55.195 CC lib/jsonrpc/jsonrpc_server.o 00:02:55.195 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:55.195 CC lib/jsonrpc/jsonrpc_client.o 00:02:55.195 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:55.195 LIB libspdk_jsonrpc.a 00:02:55.454 LIB libspdk_env_dpdk.a 00:02:55.454 CC lib/rpc/rpc.o 00:02:55.714 LIB libspdk_rpc.a 00:02:55.973 CC lib/sock/sock.o 00:02:55.973 CC lib/sock/sock_rpc.o 00:02:55.973 CC lib/trace/trace_rpc.o 00:02:55.973 CC lib/trace/trace.o 00:02:55.973 CC lib/trace/trace_flags.o 00:02:55.973 CC lib/notify/notify.o 00:02:55.973 CC lib/notify/notify_rpc.o 00:02:56.233 LIB libspdk_notify.a 00:02:56.233 LIB libspdk_trace.a 00:02:56.233 LIB libspdk_sock.a 00:02:56.493 CC lib/thread/thread.o 00:02:56.493 CC lib/thread/iobuf.o 00:02:56.493 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:56.493 CC lib/nvme/nvme_ctrlr.o 00:02:56.493 CC 
lib/nvme/nvme_ns_cmd.o 00:02:56.493 CC lib/nvme/nvme_fabric.o 00:02:56.493 CC lib/nvme/nvme_ns.o 00:02:56.493 CC lib/nvme/nvme_pcie_common.o 00:02:56.493 CC lib/nvme/nvme_pcie.o 00:02:56.493 CC lib/nvme/nvme_qpair.o 00:02:56.493 CC lib/nvme/nvme_discovery.o 00:02:56.493 CC lib/nvme/nvme_quirks.o 00:02:56.493 CC lib/nvme/nvme.o 00:02:56.493 CC lib/nvme/nvme_transport.o 00:02:56.493 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:56.493 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:56.493 CC lib/nvme/nvme_tcp.o 00:02:56.493 CC lib/nvme/nvme_opal.o 00:02:56.493 CC lib/nvme/nvme_io_msg.o 00:02:56.493 CC lib/nvme/nvme_poll_group.o 00:02:56.493 CC lib/nvme/nvme_zns.o 00:02:56.493 CC lib/nvme/nvme_cuse.o 00:02:56.493 CC lib/nvme/nvme_vfio_user.o 00:02:56.493 CC lib/nvme/nvme_rdma.o 00:02:57.432 LIB libspdk_thread.a 00:02:57.432 CC lib/blob/blobstore.o 00:02:57.432 CC lib/blob/request.o 00:02:57.432 CC lib/blob/zeroes.o 00:02:57.432 CC lib/blob/blob_bs_dev.o 00:02:57.432 CC lib/virtio/virtio_vfio_user.o 00:02:57.432 CC lib/virtio/virtio.o 00:02:57.432 CC lib/virtio/virtio_vhost_user.o 00:02:57.432 CC lib/virtio/virtio_pci.o 00:02:57.432 CC lib/vfu_tgt/tgt_rpc.o 00:02:57.432 CC lib/vfu_tgt/tgt_endpoint.o 00:02:57.432 CC lib/accel/accel.o 00:02:57.432 CC lib/accel/accel_rpc.o 00:02:57.432 CC lib/accel/accel_sw.o 00:02:57.692 CC lib/init/json_config.o 00:02:57.692 CC lib/init/subsystem.o 00:02:57.692 CC lib/init/rpc.o 00:02:57.692 CC lib/init/subsystem_rpc.o 00:02:57.692 LIB libspdk_nvme.a 00:02:57.692 LIB libspdk_init.a 00:02:57.692 LIB libspdk_virtio.a 00:02:57.692 LIB libspdk_vfu_tgt.a 00:02:57.951 CC lib/event/app.o 00:02:57.951 CC lib/event/reactor.o 00:02:57.951 CC lib/event/scheduler_static.o 00:02:57.951 CC lib/event/log_rpc.o 00:02:57.951 CC lib/event/app_rpc.o 00:02:58.211 LIB libspdk_accel.a 00:02:58.211 LIB libspdk_event.a 00:02:58.470 CC lib/bdev/bdev.o 00:02:58.470 CC lib/bdev/part.o 00:02:58.470 CC lib/bdev/bdev_rpc.o 00:02:58.470 CC lib/bdev/bdev_zone.o 00:02:58.470 CC lib/bdev/scsi_nvme.o 00:02:59.039 LIB libspdk_blob.a 00:02:59.298 CC lib/lvol/lvol.o 00:02:59.298 CC lib/blobfs/blobfs.o 00:02:59.298 CC lib/blobfs/tree.o 00:02:59.867 LIB libspdk_lvol.a 00:02:59.867 LIB libspdk_blobfs.a 00:03:00.127 LIB libspdk_bdev.a 00:03:00.385 CC lib/nbd/nbd.o 00:03:00.385 CC lib/nbd/nbd_rpc.o 00:03:00.385 CC lib/ublk/ublk_rpc.o 00:03:00.385 CC lib/ublk/ublk.o 00:03:00.385 CC lib/ftl/ftl_core.o 00:03:00.385 CC lib/ftl/ftl_init.o 00:03:00.385 CC lib/ftl/ftl_io.o 00:03:00.385 CC lib/ftl/ftl_layout.o 00:03:00.386 CC lib/ftl/ftl_sb.o 00:03:00.386 CC lib/ftl/ftl_debug.o 00:03:00.386 CC lib/ftl/ftl_l2p.o 00:03:00.386 CC lib/ftl/ftl_l2p_flat.o 00:03:00.386 CC lib/ftl/ftl_nv_cache.o 00:03:00.386 CC lib/ftl/ftl_band_ops.o 00:03:00.386 CC lib/ftl/ftl_band.o 00:03:00.386 CC lib/ftl/ftl_writer.o 00:03:00.386 CC lib/ftl/ftl_rq.o 00:03:00.386 CC lib/scsi/dev.o 00:03:00.386 CC lib/ftl/ftl_reloc.o 00:03:00.386 CC lib/ftl/ftl_l2p_cache.o 00:03:00.386 CC lib/scsi/lun.o 00:03:00.386 CC lib/ftl/ftl_p2l.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt.o 00:03:00.386 CC lib/scsi/scsi_bdev.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:00.386 CC lib/scsi/port.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:00.386 CC lib/scsi/scsi.o 00:03:00.386 CC lib/nvmf/ctrlr.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:00.386 CC lib/scsi/scsi_pr.o 00:03:00.386 CC lib/nvmf/ctrlr_discovery.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:00.386 CC lib/scsi/scsi_rpc.o 00:03:00.386 CC 
lib/nvmf/ctrlr_bdev.o 00:03:00.386 CC lib/scsi/task.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:00.386 CC lib/nvmf/subsystem.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:00.386 CC lib/nvmf/nvmf.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:00.386 CC lib/nvmf/nvmf_rpc.o 00:03:00.386 CC lib/nvmf/transport.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:00.386 CC lib/nvmf/tcp.o 00:03:00.386 CC lib/nvmf/vfio_user.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:00.386 CC lib/nvmf/rdma.o 00:03:00.386 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:00.386 CC lib/ftl/utils/ftl_conf.o 00:03:00.386 CC lib/ftl/utils/ftl_md.o 00:03:00.386 CC lib/ftl/utils/ftl_mempool.o 00:03:00.386 CC lib/ftl/utils/ftl_bitmap.o 00:03:00.386 CC lib/ftl/utils/ftl_property.o 00:03:00.386 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:00.386 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:00.386 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:00.386 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:00.386 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:00.386 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:00.386 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:00.386 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:00.386 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:00.386 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:00.386 CC lib/ftl/base/ftl_base_dev.o 00:03:00.386 CC lib/ftl/base/ftl_base_bdev.o 00:03:00.386 CC lib/ftl/ftl_trace.o 00:03:00.645 LIB libspdk_nbd.a 00:03:00.904 LIB libspdk_scsi.a 00:03:00.904 LIB libspdk_ublk.a 00:03:00.904 LIB libspdk_ftl.a 00:03:01.163 CC lib/vhost/vhost.o 00:03:01.163 CC lib/vhost/vhost_rpc.o 00:03:01.163 CC lib/vhost/vhost_scsi.o 00:03:01.163 CC lib/vhost/vhost_blk.o 00:03:01.163 CC lib/vhost/rte_vhost_user.o 00:03:01.163 CC lib/iscsi/iscsi.o 00:03:01.163 CC lib/iscsi/conn.o 00:03:01.163 CC lib/iscsi/init_grp.o 00:03:01.163 CC lib/iscsi/param.o 00:03:01.163 CC lib/iscsi/md5.o 00:03:01.163 CC lib/iscsi/iscsi_subsystem.o 00:03:01.163 CC lib/iscsi/portal_grp.o 00:03:01.163 CC lib/iscsi/tgt_node.o 00:03:01.163 CC lib/iscsi/iscsi_rpc.o 00:03:01.163 CC lib/iscsi/task.o 00:03:01.734 LIB libspdk_nvmf.a 00:03:01.734 LIB libspdk_vhost.a 00:03:01.734 LIB libspdk_iscsi.a 00:03:02.302 CC module/vfu_device/vfu_virtio_blk.o 00:03:02.302 CC module/vfu_device/vfu_virtio.o 00:03:02.302 CC module/vfu_device/vfu_virtio_scsi.o 00:03:02.302 CC module/vfu_device/vfu_virtio_rpc.o 00:03:02.302 CC module/env_dpdk/env_dpdk_rpc.o 00:03:02.302 CC module/blob/bdev/blob_bdev.o 00:03:02.302 CC module/scheduler/gscheduler/gscheduler.o 00:03:02.302 CC module/accel/ioat/accel_ioat.o 00:03:02.302 CC module/accel/ioat/accel_ioat_rpc.o 00:03:02.302 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:02.302 CC module/accel/dsa/accel_dsa.o 00:03:02.302 CC module/accel/dsa/accel_dsa_rpc.o 00:03:02.302 CC module/sock/posix/posix.o 00:03:02.302 LIB libspdk_env_dpdk_rpc.a 00:03:02.302 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:02.302 CC module/accel/iaa/accel_iaa.o 00:03:02.302 CC module/accel/iaa/accel_iaa_rpc.o 00:03:02.302 CC module/accel/error/accel_error.o 00:03:02.302 CC module/accel/error/accel_error_rpc.o 00:03:02.562 LIB libspdk_scheduler_gscheduler.a 00:03:02.562 LIB libspdk_scheduler_dpdk_governor.a 00:03:02.562 LIB libspdk_accel_ioat.a 00:03:02.562 LIB libspdk_scheduler_dynamic.a 00:03:02.562 LIB libspdk_accel_error.a 00:03:02.562 LIB libspdk_blob_bdev.a 00:03:02.562 LIB libspdk_accel_iaa.a 00:03:02.562 LIB libspdk_accel_dsa.a 00:03:02.562 LIB libspdk_vfu_device.a 00:03:02.821 LIB 
libspdk_sock_posix.a 00:03:03.081 CC module/bdev/nvme/bdev_nvme.o 00:03:03.081 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:03.081 CC module/bdev/nvme/vbdev_opal.o 00:03:03.081 CC module/bdev/nvme/nvme_rpc.o 00:03:03.081 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:03.081 CC module/bdev/nvme/bdev_mdns_client.o 00:03:03.081 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:03.081 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:03.081 CC module/blobfs/bdev/blobfs_bdev.o 00:03:03.081 CC module/bdev/delay/vbdev_delay.o 00:03:03.081 CC module/bdev/lvol/vbdev_lvol.o 00:03:03.081 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:03.081 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:03.081 CC module/bdev/gpt/gpt.o 00:03:03.081 CC module/bdev/gpt/vbdev_gpt.o 00:03:03.081 CC module/bdev/error/vbdev_error.o 00:03:03.081 CC module/bdev/passthru/vbdev_passthru.o 00:03:03.081 CC module/bdev/error/vbdev_error_rpc.o 00:03:03.081 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:03.081 CC module/bdev/aio/bdev_aio.o 00:03:03.081 CC module/bdev/aio/bdev_aio_rpc.o 00:03:03.081 CC module/bdev/raid/bdev_raid_rpc.o 00:03:03.081 CC module/bdev/raid/bdev_raid.o 00:03:03.081 CC module/bdev/null/bdev_null.o 00:03:03.081 CC module/bdev/raid/bdev_raid_sb.o 00:03:03.081 CC module/bdev/ftl/bdev_ftl.o 00:03:03.081 CC module/bdev/null/bdev_null_rpc.o 00:03:03.081 CC module/bdev/raid/raid0.o 00:03:03.081 CC module/bdev/raid/raid1.o 00:03:03.081 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:03.081 CC module/bdev/iscsi/bdev_iscsi.o 00:03:03.081 CC module/bdev/raid/concat.o 00:03:03.081 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:03.081 CC module/bdev/malloc/bdev_malloc.o 00:03:03.081 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:03.081 CC module/bdev/split/vbdev_split.o 00:03:03.081 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:03.081 CC module/bdev/split/vbdev_split_rpc.o 00:03:03.081 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:03.081 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:03.081 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:03.081 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:03.081 LIB libspdk_blobfs_bdev.a 00:03:03.081 LIB libspdk_bdev_split.a 00:03:03.081 LIB libspdk_bdev_gpt.a 00:03:03.081 LIB libspdk_bdev_error.a 00:03:03.081 LIB libspdk_bdev_null.a 00:03:03.081 LIB libspdk_bdev_passthru.a 00:03:03.081 LIB libspdk_bdev_ftl.a 00:03:03.081 LIB libspdk_bdev_aio.a 00:03:03.341 LIB libspdk_bdev_delay.a 00:03:03.341 LIB libspdk_bdev_iscsi.a 00:03:03.341 LIB libspdk_bdev_zone_block.a 00:03:03.341 LIB libspdk_bdev_malloc.a 00:03:03.341 LIB libspdk_bdev_lvol.a 00:03:03.341 LIB libspdk_bdev_virtio.a 00:03:03.601 LIB libspdk_bdev_raid.a 00:03:04.187 LIB libspdk_bdev_nvme.a 00:03:04.755 CC module/event/subsystems/sock/sock.o 00:03:04.755 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:04.755 CC module/event/subsystems/scheduler/scheduler.o 00:03:04.755 CC module/event/subsystems/iobuf/iobuf.o 00:03:04.755 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:04.755 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:04.755 CC module/event/subsystems/vmd/vmd.o 00:03:04.755 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:04.755 LIB libspdk_event_sock.a 00:03:04.755 LIB libspdk_event_vhost_blk.a 00:03:04.755 LIB libspdk_event_scheduler.a 00:03:04.755 LIB libspdk_event_vfu_tgt.a 00:03:04.755 LIB libspdk_event_vmd.a 00:03:04.755 LIB libspdk_event_iobuf.a 00:03:05.324 CC module/event/subsystems/accel/accel.o 00:03:05.324 LIB libspdk_event_accel.a 00:03:05.583 CC module/event/subsystems/bdev/bdev.o 00:03:05.583 LIB 
libspdk_event_bdev.a 00:03:05.842 CC module/event/subsystems/scsi/scsi.o 00:03:05.842 CC module/event/subsystems/ublk/ublk.o 00:03:05.842 CC module/event/subsystems/nbd/nbd.o 00:03:06.102 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:06.102 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:06.102 LIB libspdk_event_ublk.a 00:03:06.102 LIB libspdk_event_scsi.a 00:03:06.102 LIB libspdk_event_nbd.a 00:03:06.102 LIB libspdk_event_nvmf.a 00:03:06.361 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:06.361 CC module/event/subsystems/iscsi/iscsi.o 00:03:06.361 LIB libspdk_event_vhost_scsi.a 00:03:06.361 LIB libspdk_event_iscsi.a 00:03:06.935 CC test/rpc_client/rpc_client_test.o 00:03:06.935 TEST_HEADER include/spdk/accel.h 00:03:06.935 TEST_HEADER include/spdk/accel_module.h 00:03:06.935 TEST_HEADER include/spdk/assert.h 00:03:06.935 TEST_HEADER include/spdk/base64.h 00:03:06.935 TEST_HEADER include/spdk/bdev.h 00:03:06.935 TEST_HEADER include/spdk/barrier.h 00:03:06.935 TEST_HEADER include/spdk/bdev_zone.h 00:03:06.935 TEST_HEADER include/spdk/blob_bdev.h 00:03:06.935 TEST_HEADER include/spdk/bdev_module.h 00:03:06.935 TEST_HEADER include/spdk/bit_array.h 00:03:06.935 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:06.935 TEST_HEADER include/spdk/bit_pool.h 00:03:06.935 TEST_HEADER include/spdk/blobfs.h 00:03:06.935 TEST_HEADER include/spdk/blob.h 00:03:06.935 CC app/spdk_lspci/spdk_lspci.o 00:03:06.935 TEST_HEADER include/spdk/conf.h 00:03:06.935 TEST_HEADER include/spdk/config.h 00:03:06.935 TEST_HEADER include/spdk/cpuset.h 00:03:06.935 TEST_HEADER include/spdk/crc16.h 00:03:06.935 TEST_HEADER include/spdk/crc32.h 00:03:06.935 TEST_HEADER include/spdk/dif.h 00:03:06.935 TEST_HEADER include/spdk/crc64.h 00:03:06.935 TEST_HEADER include/spdk/dma.h 00:03:06.935 TEST_HEADER include/spdk/endian.h 00:03:06.935 TEST_HEADER include/spdk/env.h 00:03:06.935 TEST_HEADER include/spdk/env_dpdk.h 00:03:06.935 TEST_HEADER include/spdk/event.h 00:03:06.935 TEST_HEADER include/spdk/fd_group.h 00:03:06.935 TEST_HEADER include/spdk/fd.h 00:03:06.935 TEST_HEADER include/spdk/file.h 00:03:06.935 TEST_HEADER include/spdk/ftl.h 00:03:06.935 CC app/trace_record/trace_record.o 00:03:06.935 TEST_HEADER include/spdk/gpt_spec.h 00:03:06.935 CC app/spdk_top/spdk_top.o 00:03:06.935 TEST_HEADER include/spdk/hexlify.h 00:03:06.935 TEST_HEADER include/spdk/histogram_data.h 00:03:06.935 CC app/spdk_nvme_discover/discovery_aer.o 00:03:06.935 TEST_HEADER include/spdk/init.h 00:03:06.935 TEST_HEADER include/spdk/idxd.h 00:03:06.935 TEST_HEADER include/spdk/idxd_spec.h 00:03:06.935 TEST_HEADER include/spdk/ioat.h 00:03:06.935 TEST_HEADER include/spdk/iscsi_spec.h 00:03:06.935 TEST_HEADER include/spdk/ioat_spec.h 00:03:06.936 TEST_HEADER include/spdk/json.h 00:03:06.936 CC app/spdk_nvme_perf/perf.o 00:03:06.936 TEST_HEADER include/spdk/jsonrpc.h 00:03:06.936 CXX app/trace/trace.o 00:03:06.936 TEST_HEADER include/spdk/likely.h 00:03:06.936 TEST_HEADER include/spdk/lvol.h 00:03:06.936 TEST_HEADER include/spdk/log.h 00:03:06.936 TEST_HEADER include/spdk/memory.h 00:03:06.936 TEST_HEADER include/spdk/mmio.h 00:03:06.936 TEST_HEADER include/spdk/nbd.h 00:03:06.936 TEST_HEADER include/spdk/notify.h 00:03:06.936 TEST_HEADER include/spdk/nvme.h 00:03:06.936 TEST_HEADER include/spdk/nvme_intel.h 00:03:06.936 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:06.936 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:06.936 TEST_HEADER include/spdk/nvme_spec.h 00:03:06.936 TEST_HEADER include/spdk/nvme_zns.h 00:03:06.936 TEST_HEADER 
include/spdk/nvmf_cmd.h 00:03:06.936 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:06.936 TEST_HEADER include/spdk/nvmf.h 00:03:06.936 CC app/spdk_nvme_identify/identify.o 00:03:06.936 TEST_HEADER include/spdk/nvmf_spec.h 00:03:06.936 TEST_HEADER include/spdk/nvmf_transport.h 00:03:06.936 TEST_HEADER include/spdk/opal_spec.h 00:03:06.936 TEST_HEADER include/spdk/opal.h 00:03:06.936 TEST_HEADER include/spdk/pci_ids.h 00:03:06.936 TEST_HEADER include/spdk/pipe.h 00:03:06.936 TEST_HEADER include/spdk/queue.h 00:03:06.936 TEST_HEADER include/spdk/reduce.h 00:03:06.936 TEST_HEADER include/spdk/rpc.h 00:03:06.936 TEST_HEADER include/spdk/scheduler.h 00:03:06.936 TEST_HEADER include/spdk/scsi.h 00:03:06.936 TEST_HEADER include/spdk/scsi_spec.h 00:03:06.936 TEST_HEADER include/spdk/sock.h 00:03:06.936 TEST_HEADER include/spdk/string.h 00:03:06.936 TEST_HEADER include/spdk/stdinc.h 00:03:06.936 TEST_HEADER include/spdk/thread.h 00:03:06.936 TEST_HEADER include/spdk/trace_parser.h 00:03:06.936 TEST_HEADER include/spdk/trace.h 00:03:06.936 TEST_HEADER include/spdk/ublk.h 00:03:06.936 TEST_HEADER include/spdk/util.h 00:03:06.936 TEST_HEADER include/spdk/tree.h 00:03:06.936 TEST_HEADER include/spdk/uuid.h 00:03:06.936 TEST_HEADER include/spdk/version.h 00:03:06.936 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:06.936 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:06.936 TEST_HEADER include/spdk/vmd.h 00:03:06.936 TEST_HEADER include/spdk/vhost.h 00:03:06.936 TEST_HEADER include/spdk/xor.h 00:03:06.936 TEST_HEADER include/spdk/zipf.h 00:03:06.936 CXX test/cpp_headers/accel.o 00:03:06.936 CXX test/cpp_headers/assert.o 00:03:06.936 CXX test/cpp_headers/accel_module.o 00:03:06.936 CXX test/cpp_headers/barrier.o 00:03:06.936 CXX test/cpp_headers/base64.o 00:03:06.936 CXX test/cpp_headers/bdev.o 00:03:06.936 CXX test/cpp_headers/bdev_zone.o 00:03:06.936 CC app/nvmf_tgt/nvmf_main.o 00:03:06.936 CC app/vhost/vhost.o 00:03:06.936 CXX test/cpp_headers/bdev_module.o 00:03:06.936 CXX test/cpp_headers/bit_array.o 00:03:06.936 CXX test/cpp_headers/bit_pool.o 00:03:06.936 CXX test/cpp_headers/blob_bdev.o 00:03:06.936 CXX test/cpp_headers/blobfs_bdev.o 00:03:06.936 CXX test/cpp_headers/blob.o 00:03:06.936 CXX test/cpp_headers/blobfs.o 00:03:06.936 CC app/spdk_dd/spdk_dd.o 00:03:06.936 CXX test/cpp_headers/conf.o 00:03:06.936 CXX test/cpp_headers/config.o 00:03:06.936 CXX test/cpp_headers/cpuset.o 00:03:06.936 CXX test/cpp_headers/crc16.o 00:03:06.936 CXX test/cpp_headers/crc32.o 00:03:06.936 CXX test/cpp_headers/crc64.o 00:03:06.936 CXX test/cpp_headers/dif.o 00:03:06.936 CXX test/cpp_headers/dma.o 00:03:06.936 CXX test/cpp_headers/endian.o 00:03:06.936 CXX test/cpp_headers/env_dpdk.o 00:03:06.936 CXX test/cpp_headers/env.o 00:03:06.936 CXX test/cpp_headers/event.o 00:03:06.936 CXX test/cpp_headers/fd_group.o 00:03:06.936 CC app/iscsi_tgt/iscsi_tgt.o 00:03:06.936 CXX test/cpp_headers/fd.o 00:03:06.936 CXX test/cpp_headers/file.o 00:03:06.936 CXX test/cpp_headers/gpt_spec.o 00:03:06.936 CXX test/cpp_headers/ftl.o 00:03:06.936 CXX test/cpp_headers/hexlify.o 00:03:06.936 CXX test/cpp_headers/idxd.o 00:03:06.936 CXX test/cpp_headers/histogram_data.o 00:03:06.936 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:06.936 CXX test/cpp_headers/idxd_spec.o 00:03:06.936 CXX test/cpp_headers/init.o 00:03:06.936 CC app/spdk_tgt/spdk_tgt.o 00:03:06.936 CC test/event/event_perf/event_perf.o 00:03:06.936 CC test/event/reactor_perf/reactor_perf.o 00:03:06.936 CC test/env/pci/pci_ut.o 00:03:06.936 CC test/event/reactor/reactor.o 
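Editor's note on the TEST_HEADER / "CXX test/cpp_headers/*.o" entries above: this is SPDK's header self-sufficiency check, where each public header under include/spdk is compiled on its own in a C++ translation unit, so a header that forgets one of its own includes fails the build here rather than in a consumer. A minimal sketch of the idea in shell; the loop and the check_*.cpp file names are illustrative, not SPDK's actual generator.

    # Emit a one-line translation unit per public header and compile it
    # in isolation; a non-self-contained header breaks at this step.
    for h in include/spdk/*.h; do
      name=$(basename "$h" .h)
      printf '#include <spdk/%s.h>\n' "$name" > "check_$name.cpp"
      g++ -std=c++11 -Iinclude -c "check_$name.cpp" -o "check_$name.o"
    done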
00:03:06.936 CC test/app/jsoncat/jsoncat.o 00:03:06.936 CC test/app/histogram_perf/histogram_perf.o 00:03:06.936 CC examples/vmd/lsvmd/lsvmd.o 00:03:06.936 CC test/env/vtophys/vtophys.o 00:03:06.936 CC test/thread/lock/spdk_lock.o 00:03:06.936 CXX test/cpp_headers/ioat.o 00:03:06.936 CC test/nvme/connect_stress/connect_stress.o 00:03:06.936 CC examples/ioat/perf/perf.o 00:03:06.936 CC test/nvme/simple_copy/simple_copy.o 00:03:06.936 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:06.936 CC test/nvme/boot_partition/boot_partition.o 00:03:06.936 CC test/env/memory/memory_ut.o 00:03:06.936 CC test/nvme/aer/aer.o 00:03:06.936 CC test/app/stub/stub.o 00:03:06.936 CC test/nvme/compliance/nvme_compliance.o 00:03:06.936 CC test/nvme/e2edp/nvme_dp.o 00:03:06.936 CC test/thread/poller_perf/poller_perf.o 00:03:06.936 CC test/nvme/reset/reset.o 00:03:06.936 CC test/nvme/reserve/reserve.o 00:03:06.936 CC test/nvme/sgl/sgl.o 00:03:06.936 CC examples/vmd/led/led.o 00:03:06.936 CC test/nvme/cuse/cuse.o 00:03:06.936 CC examples/accel/perf/accel_perf.o 00:03:06.936 CC test/nvme/startup/startup.o 00:03:06.936 CC examples/nvme/reconnect/reconnect.o 00:03:06.936 CC test/nvme/overhead/overhead.o 00:03:06.936 CC test/nvme/fdp/fdp.o 00:03:06.936 CC test/nvme/err_injection/err_injection.o 00:03:06.936 CC test/event/app_repeat/app_repeat.o 00:03:06.936 CC examples/util/zipf/zipf.o 00:03:06.936 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:06.936 CC examples/sock/hello_world/hello_sock.o 00:03:06.936 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:06.936 CC test/nvme/fused_ordering/fused_ordering.o 00:03:06.936 CC examples/nvme/hotplug/hotplug.o 00:03:06.936 CC examples/ioat/verify/verify.o 00:03:06.936 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:06.936 CC examples/nvme/hello_world/hello_world.o 00:03:06.936 CC examples/nvme/arbitration/arbitration.o 00:03:06.936 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:06.936 CC examples/nvme/abort/abort.o 00:03:06.936 CC test/event/scheduler/scheduler.o 00:03:06.936 CC app/fio/nvme/fio_plugin.o 00:03:06.936 CC examples/idxd/perf/perf.o 00:03:06.936 CC test/accel/dif/dif.o 00:03:06.936 CC test/bdev/bdevio/bdevio.o 00:03:06.936 LINK spdk_lspci 00:03:06.936 CC test/dma/test_dma/test_dma.o 00:03:06.936 CC test/blobfs/mkfs/mkfs.o 00:03:06.936 CC examples/bdev/hello_world/hello_bdev.o 00:03:06.936 CC test/app/bdev_svc/bdev_svc.o 00:03:06.936 CC examples/blob/hello_world/hello_blob.o 00:03:06.936 CC examples/blob/cli/blobcli.o 00:03:06.936 CC examples/thread/thread/thread_ex.o 00:03:06.936 CC examples/nvmf/nvmf/nvmf.o 00:03:06.936 CC examples/bdev/bdevperf/bdevperf.o 00:03:06.936 LINK rpc_client_test 00:03:06.936 CC app/fio/bdev/fio_plugin.o 00:03:06.936 CC test/env/mem_callbacks/mem_callbacks.o 00:03:06.936 CC test/lvol/esnap/esnap.o 00:03:06.936 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:07.197 LINK spdk_nvme_discover 00:03:07.197 LINK spdk_trace_record 00:03:07.197 CXX test/cpp_headers/ioat_spec.o 00:03:07.197 CXX test/cpp_headers/iscsi_spec.o 00:03:07.197 CXX test/cpp_headers/json.o 00:03:07.197 CXX test/cpp_headers/jsonrpc.o 00:03:07.197 CXX test/cpp_headers/likely.o 00:03:07.197 LINK histogram_perf 00:03:07.197 CXX test/cpp_headers/log.o 00:03:07.197 LINK lsvmd 00:03:07.197 CXX test/cpp_headers/lvol.o 00:03:07.197 CXX test/cpp_headers/memory.o 00:03:07.197 CXX test/cpp_headers/mmio.o 00:03:07.197 LINK event_perf 00:03:07.197 CXX test/cpp_headers/nbd.o 00:03:07.197 CXX test/cpp_headers/notify.o 00:03:07.197 LINK reactor_perf 00:03:07.197 LINK jsoncat 
00:03:07.197 CXX test/cpp_headers/nvme.o 00:03:07.197 CXX test/cpp_headers/nvme_intel.o 00:03:07.197 CXX test/cpp_headers/nvme_ocssd.o 00:03:07.197 LINK vtophys 00:03:07.197 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:07.197 CXX test/cpp_headers/nvme_spec.o 00:03:07.197 LINK reactor 00:03:07.197 CXX test/cpp_headers/nvme_zns.o 00:03:07.197 CXX test/cpp_headers/nvmf_cmd.o 00:03:07.197 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:07.197 CXX test/cpp_headers/nvmf.o 00:03:07.197 CXX test/cpp_headers/nvmf_spec.o 00:03:07.197 LINK nvmf_tgt 00:03:07.197 CXX test/cpp_headers/nvmf_transport.o 00:03:07.197 CXX test/cpp_headers/opal.o 00:03:07.197 CXX test/cpp_headers/opal_spec.o 00:03:07.197 CXX test/cpp_headers/pci_ids.o 00:03:07.197 CXX test/cpp_headers/pipe.o 00:03:07.197 CXX test/cpp_headers/queue.o 00:03:07.197 LINK led 00:03:07.197 LINK interrupt_tgt 00:03:07.197 LINK poller_perf 00:03:07.197 CXX test/cpp_headers/reduce.o 00:03:07.197 LINK vhost 00:03:07.197 LINK env_dpdk_post_init 00:03:07.197 CXX test/cpp_headers/rpc.o 00:03:07.197 LINK zipf 00:03:07.197 LINK app_repeat 00:03:07.197 CXX test/cpp_headers/scheduler.o 00:03:07.197 CXX test/cpp_headers/scsi.o 00:03:07.197 CXX test/cpp_headers/scsi_spec.o 00:03:07.197 LINK boot_partition 00:03:07.197 CXX test/cpp_headers/sock.o 00:03:07.197 LINK startup 00:03:07.197 LINK stub 00:03:07.197 LINK connect_stress 00:03:07.197 LINK iscsi_tgt 00:03:07.197 LINK spdk_tgt 00:03:07.197 LINK err_injection 00:03:07.197 LINK pmr_persistence 00:03:07.197 CXX test/cpp_headers/stdinc.o 00:03:07.197 LINK doorbell_aers 00:03:07.197 LINK reserve 00:03:07.197 LINK fused_ordering 00:03:07.197 LINK ioat_perf 00:03:07.197 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:07.197 LINK cmb_copy 00:03:07.197 LINK verify 00:03:07.197 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:07.197 LINK simple_copy 00:03:07.197 LINK hello_world 00:03:07.197 LINK mkfs 00:03:07.197 LINK scheduler 00:03:07.197 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:07.197 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:07.197 LINK bdev_svc 00:03:07.197 LINK hotplug 00:03:07.197 LINK hello_sock 00:03:07.197 LINK nvme_dp 00:03:07.197 LINK aer 00:03:07.197 LINK reset 00:03:07.197 LINK fdp 00:03:07.197 LINK mem_callbacks 00:03:07.458 LINK hello_bdev 00:03:07.458 LINK sgl 00:03:07.458 LINK overhead 00:03:07.458 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:07.458 CXX test/cpp_headers/string.o 00:03:07.458 CXX test/cpp_headers/thread.o 00:03:07.458 LINK hello_blob 00:03:07.458 CXX test/cpp_headers/trace.o 00:03:07.458 CXX test/cpp_headers/trace_parser.o 00:03:07.458 CXX test/cpp_headers/tree.o 00:03:07.458 CXX test/cpp_headers/ublk.o 00:03:07.458 CXX test/cpp_headers/util.o 00:03:07.458 CXX test/cpp_headers/uuid.o 00:03:07.458 LINK thread 00:03:07.458 CXX test/cpp_headers/version.o 00:03:07.458 CXX test/cpp_headers/vfio_user_pci.o 00:03:07.458 CXX test/cpp_headers/vfio_user_spec.o 00:03:07.458 CXX test/cpp_headers/vhost.o 00:03:07.458 CXX test/cpp_headers/vmd.o 00:03:07.458 CXX test/cpp_headers/xor.o 00:03:07.458 LINK spdk_trace 00:03:07.458 CXX test/cpp_headers/zipf.o 00:03:07.458 LINK idxd_perf 00:03:07.458 LINK reconnect 00:03:07.458 LINK nvmf 00:03:07.458 LINK arbitration 00:03:07.458 LINK abort 00:03:07.458 LINK dif 00:03:07.458 LINK pci_ut 00:03:07.458 LINK test_dma 00:03:07.458 LINK bdevio 00:03:07.458 LINK spdk_dd 00:03:07.458 LINK nvme_compliance 00:03:07.717 LINK accel_perf 00:03:07.717 LINK nvme_manage 00:03:07.717 LINK nvme_fuzz 00:03:07.717 LINK spdk_nvme 00:03:07.717 LINK 
spdk_bdev 00:03:07.717 LINK blobcli 00:03:07.717 LINK memory_ut 00:03:07.717 LINK llvm_vfio_fuzz 00:03:07.717 LINK spdk_nvme_perf 00:03:07.717 LINK vhost_fuzz 00:03:07.975 LINK bdevperf 00:03:07.975 LINK spdk_nvme_identify 00:03:07.975 LINK spdk_top 00:03:08.232 LINK cuse 00:03:08.232 LINK llvm_nvme_fuzz 00:03:08.490 LINK spdk_lock 00:03:09.057 LINK iscsi_fuzz 00:03:10.432 LINK esnap 00:03:10.690 00:03:10.690 real 0m23.223s 00:03:10.690 user 4m14.022s 00:03:10.690 sys 2m3.853s 00:03:10.690 05:28:21 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:03:10.690 05:28:21 -- common/autotest_common.sh@10 -- $ set +x 00:03:10.690 ************************************ 00:03:10.690 END TEST make 00:03:10.690 ************************************ 00:03:10.948 05:28:22 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:10.948 05:28:22 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:10.948 05:28:22 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:10.948 05:28:22 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:10.948 05:28:22 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:10.948 05:28:22 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:10.948 05:28:22 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:10.949 05:28:22 -- scripts/common.sh@335 -- # IFS=.-: 00:03:10.949 05:28:22 -- scripts/common.sh@335 -- # read -ra ver1 00:03:10.949 05:28:22 -- scripts/common.sh@336 -- # IFS=.-: 00:03:10.949 05:28:22 -- scripts/common.sh@336 -- # read -ra ver2 00:03:10.949 05:28:22 -- scripts/common.sh@337 -- # local 'op=<' 00:03:10.949 05:28:22 -- scripts/common.sh@339 -- # ver1_l=2 00:03:10.949 05:28:22 -- scripts/common.sh@340 -- # ver2_l=1 00:03:10.949 05:28:22 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:10.949 05:28:22 -- scripts/common.sh@343 -- # case "$op" in 00:03:10.949 05:28:22 -- scripts/common.sh@344 -- # : 1 00:03:10.949 05:28:22 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:10.949 05:28:22 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:10.949 05:28:22 -- scripts/common.sh@364 -- # decimal 1 00:03:10.949 05:28:22 -- scripts/common.sh@352 -- # local d=1 00:03:10.949 05:28:22 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:10.949 05:28:22 -- scripts/common.sh@354 -- # echo 1 00:03:10.949 05:28:22 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:10.949 05:28:22 -- scripts/common.sh@365 -- # decimal 2 00:03:10.949 05:28:22 -- scripts/common.sh@352 -- # local d=2 00:03:10.949 05:28:22 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:10.949 05:28:22 -- scripts/common.sh@354 -- # echo 2 00:03:10.949 05:28:22 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:10.949 05:28:22 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:10.949 05:28:22 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:10.949 05:28:22 -- scripts/common.sh@367 -- # return 0 00:03:10.949 05:28:22 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:10.949 05:28:22 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:10.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:10.949 --rc genhtml_branch_coverage=1 00:03:10.949 --rc genhtml_function_coverage=1 00:03:10.949 --rc genhtml_legend=1 00:03:10.949 --rc geninfo_all_blocks=1 00:03:10.949 --rc geninfo_unexecuted_blocks=1 00:03:10.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:10.949 ' 00:03:10.949 05:28:22 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:10.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:10.949 --rc genhtml_branch_coverage=1 00:03:10.949 --rc genhtml_function_coverage=1 00:03:10.949 --rc genhtml_legend=1 00:03:10.949 --rc geninfo_all_blocks=1 00:03:10.949 --rc geninfo_unexecuted_blocks=1 00:03:10.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:10.949 ' 00:03:10.949 05:28:22 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:10.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:10.949 --rc genhtml_branch_coverage=1 00:03:10.949 --rc genhtml_function_coverage=1 00:03:10.949 --rc genhtml_legend=1 00:03:10.949 --rc geninfo_all_blocks=1 00:03:10.949 --rc geninfo_unexecuted_blocks=1 00:03:10.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:10.949 ' 00:03:10.949 05:28:22 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:10.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:10.949 --rc genhtml_branch_coverage=1 00:03:10.949 --rc genhtml_function_coverage=1 00:03:10.949 --rc genhtml_legend=1 00:03:10.949 --rc geninfo_all_blocks=1 00:03:10.949 --rc geninfo_unexecuted_blocks=1 00:03:10.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:10.949 ' 00:03:10.949 05:28:22 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:10.949 05:28:22 -- nvmf/common.sh@7 -- # uname -s 00:03:10.949 05:28:22 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:10.949 05:28:22 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:10.949 05:28:22 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:10.949 05:28:22 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:10.949 05:28:22 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:10.949 05:28:22 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:10.949 05:28:22 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:10.949 05:28:22 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:10.949 05:28:22 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:10.949 05:28:22 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:10.949 05:28:22 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:10.949 05:28:22 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:10.949 05:28:22 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:10.949 05:28:22 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:10.949 05:28:22 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:10.949 05:28:22 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:10.949 05:28:22 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:10.949 05:28:22 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:10.949 05:28:22 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:10.949 05:28:22 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:10.949 05:28:22 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:10.949 05:28:22 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:10.949 05:28:22 -- paths/export.sh@5 -- # export PATH 00:03:10.949 05:28:22 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:10.949 05:28:22 -- nvmf/common.sh@46 -- # : 0 00:03:10.949 05:28:22 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:10.949 05:28:22 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:10.949 05:28:22 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:10.949 05:28:22 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:10.949 05:28:22 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:10.949 05:28:22 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:10.949 05:28:22 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:10.949 05:28:22 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:10.949 05:28:22 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:10.949 05:28:22 -- spdk/autotest.sh@32 -- # uname -s 00:03:10.949 05:28:22 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:10.949 05:28:22 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:10.949 05:28:22 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:10.949 05:28:22 -- 
spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:10.949 05:28:22 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:10.949 05:28:22 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:10.949 05:28:22 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:10.949 05:28:22 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:10.949 05:28:22 -- spdk/autotest.sh@48 -- # udevadm_pid=2140038 00:03:10.949 05:28:22 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:10.949 05:28:22 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:10.949 05:28:22 -- spdk/autotest.sh@54 -- # echo 2140040 00:03:10.949 05:28:22 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:10.949 05:28:22 -- spdk/autotest.sh@56 -- # echo 2140041 00:03:10.949 05:28:22 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:10.949 05:28:22 -- spdk/autotest.sh@58 -- # [[ ............................... != QEMU ]] 00:03:10.949 05:28:22 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:10.949 05:28:22 -- spdk/autotest.sh@60 -- # echo 2140042 00:03:10.949 05:28:22 -- spdk/autotest.sh@62 -- # echo 2140044 00:03:10.949 05:28:22 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:10.949 05:28:22 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:10.949 05:28:22 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:10.949 05:28:22 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:10.949 05:28:22 -- common/autotest_common.sh@10 -- # set +x 00:03:10.949 05:28:22 -- spdk/autotest.sh@70 -- # create_test_list 00:03:10.949 05:28:22 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:10.949 05:28:22 -- common/autotest_common.sh@10 -- # set +x 00:03:10.949 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:10.949 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:10.949 05:28:22 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:10.949 05:28:22 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:10.949 05:28:22 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:10.949 05:28:22 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:10.949 05:28:22 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:10.949 05:28:22 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:11.208 05:28:22 -- common/autotest_common.sh@1450 -- # uname 00:03:11.208 05:28:22 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:03:11.208 05:28:22 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 
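Editor's note on the autotest.sh@33-40 trace above: it swaps the kernel's core-dump handler for the duration of the run. Writing a pipe-prefixed command to kernel.core_pattern makes the kernel feed any crashing process to SPDK's core-collector.sh, with %P, %s and %t expanding to the PID, the signal number and the dump time. A hedged sketch of the mechanism, with illustrative variable names and paths (requires root):

    # Save the existing handler so it can be restored on exit.
    old_core_pattern=$(cat /proc/sys/kernel/core_pattern)
    mkdir -p "$output_dir/coredumps"
    # The leading '|' tells the kernel to pipe cores to the command.
    echo "|$rootdir/scripts/core-collector.sh %P %s %t" > /proc/sys/kernel/core_pattern
    # ... run the tests ...
    echo "$old_core_pattern" > /proc/sys/kernel/core_pattern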
00:03:11.208 05:28:22 -- common/autotest_common.sh@1470 -- # uname 00:03:11.208 05:28:22 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:03:11.208 05:28:22 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:03:11.208 05:28:22 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:11.208 lcov: LCOV version 1.15 00:03:11.208 05:28:22 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:12.774 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:03:13.072 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:03:13.072 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:03:25.282 05:28:36 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:03:25.282 05:28:36 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:25.282 05:28:36 -- common/autotest_common.sh@10 -- # set +x 00:03:25.282 05:28:36 -- spdk/autotest.sh@89 -- # rm -f 00:03:25.282 05:28:36 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:28.567 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:28.567 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:28.826 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:28.826 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:28.826 05:28:39 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:03:28.826 05:28:39 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:28.826 05:28:39 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:28.826 05:28:39 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:28.826 05:28:39 -- 
common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:28.826 05:28:39 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:28.826 05:28:39 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:28.826 05:28:39 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:28.826 05:28:39 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:28.826 05:28:39 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:03:28.826 05:28:39 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:03:28.826 05:28:39 -- spdk/autotest.sh@108 -- # grep -v p 00:03:28.826 05:28:39 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:28.826 05:28:39 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:03:28.826 05:28:39 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:03:28.826 05:28:39 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:28.826 05:28:39 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:28.826 No valid GPT data, bailing 00:03:28.826 05:28:40 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:28.826 05:28:40 -- scripts/common.sh@393 -- # pt= 00:03:28.826 05:28:40 -- scripts/common.sh@394 -- # return 1 00:03:28.827 05:28:40 -- spdk/autotest.sh@112 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:28.827 1+0 records in 00:03:28.827 1+0 records out 00:03:28.827 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00159307 s, 658 MB/s 00:03:28.827 05:28:40 -- spdk/autotest.sh@116 -- # sync 00:03:28.827 05:28:40 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:28.827 05:28:40 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:28.827 05:28:40 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:35.391 05:28:46 -- spdk/autotest.sh@122 -- # uname -s 00:03:35.391 05:28:46 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:03:35.391 05:28:46 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:35.391 05:28:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:35.391 05:28:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:35.391 05:28:46 -- common/autotest_common.sh@10 -- # set +x 00:03:35.391 ************************************ 00:03:35.391 START TEST setup.sh 00:03:35.391 ************************************ 00:03:35.391 05:28:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:35.391 * Looking for test storage... 
00:03:35.391 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:35.391 05:28:46 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:35.391 05:28:46 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:35.391 05:28:46 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:35.650 05:28:46 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:35.650 05:28:46 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:35.650 05:28:46 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:35.650 05:28:46 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:35.650 05:28:46 -- scripts/common.sh@335 -- # IFS=.-: 00:03:35.650 05:28:46 -- scripts/common.sh@335 -- # read -ra ver1 00:03:35.650 05:28:46 -- scripts/common.sh@336 -- # IFS=.-: 00:03:35.650 05:28:46 -- scripts/common.sh@336 -- # read -ra ver2 00:03:35.650 05:28:46 -- scripts/common.sh@337 -- # local 'op=<' 00:03:35.650 05:28:46 -- scripts/common.sh@339 -- # ver1_l=2 00:03:35.650 05:28:46 -- scripts/common.sh@340 -- # ver2_l=1 00:03:35.650 05:28:46 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:35.650 05:28:46 -- scripts/common.sh@343 -- # case "$op" in 00:03:35.650 05:28:46 -- scripts/common.sh@344 -- # : 1 00:03:35.650 05:28:46 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:35.650 05:28:46 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:35.650 05:28:46 -- scripts/common.sh@364 -- # decimal 1 00:03:35.650 05:28:46 -- scripts/common.sh@352 -- # local d=1 00:03:35.650 05:28:46 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:35.650 05:28:46 -- scripts/common.sh@354 -- # echo 1 00:03:35.650 05:28:46 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:35.650 05:28:46 -- scripts/common.sh@365 -- # decimal 2 00:03:35.650 05:28:46 -- scripts/common.sh@352 -- # local d=2 00:03:35.650 05:28:46 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:35.650 05:28:46 -- scripts/common.sh@354 -- # echo 2 00:03:35.650 05:28:46 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:35.650 05:28:46 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:35.650 05:28:46 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:35.650 05:28:46 -- scripts/common.sh@367 -- # return 0 00:03:35.650 05:28:46 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:35.650 05:28:46 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:35.650 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:35.650 --rc genhtml_branch_coverage=1 00:03:35.650 --rc genhtml_function_coverage=1 00:03:35.650 --rc genhtml_legend=1 00:03:35.650 --rc geninfo_all_blocks=1 00:03:35.650 --rc geninfo_unexecuted_blocks=1 00:03:35.650 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:35.650 ' 00:03:35.650 05:28:46 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:35.650 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:35.650 --rc genhtml_branch_coverage=1 00:03:35.650 --rc genhtml_function_coverage=1 00:03:35.650 --rc genhtml_legend=1 00:03:35.650 --rc geninfo_all_blocks=1 00:03:35.650 --rc geninfo_unexecuted_blocks=1 00:03:35.650 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:35.650 ' 00:03:35.650 05:28:46 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:35.650 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:35.650 --rc genhtml_branch_coverage=1 
00:03:35.650 --rc genhtml_function_coverage=1 00:03:35.650 --rc genhtml_legend=1 00:03:35.650 --rc geninfo_all_blocks=1 00:03:35.650 --rc geninfo_unexecuted_blocks=1 00:03:35.650 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:35.650 ' 00:03:35.650 05:28:46 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:35.650 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:35.650 --rc genhtml_branch_coverage=1 00:03:35.650 --rc genhtml_function_coverage=1 00:03:35.650 --rc genhtml_legend=1 00:03:35.650 --rc geninfo_all_blocks=1 00:03:35.650 --rc geninfo_unexecuted_blocks=1 00:03:35.650 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:35.650 ' 00:03:35.650 05:28:46 -- setup/test-setup.sh@10 -- # uname -s 00:03:35.650 05:28:46 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:35.650 05:28:46 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:35.650 05:28:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:35.650 05:28:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:35.650 05:28:46 -- common/autotest_common.sh@10 -- # set +x 00:03:35.650 ************************************ 00:03:35.650 START TEST acl 00:03:35.650 ************************************ 00:03:35.650 05:28:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:35.650 * Looking for test storage... 00:03:35.650 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:35.650 05:28:46 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:35.650 05:28:46 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:35.650 05:28:46 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:35.650 05:28:46 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:35.650 05:28:46 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:35.650 05:28:46 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:35.650 05:28:46 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:35.650 05:28:46 -- scripts/common.sh@335 -- # IFS=.-: 00:03:35.650 05:28:46 -- scripts/common.sh@335 -- # read -ra ver1 00:03:35.650 05:28:46 -- scripts/common.sh@336 -- # IFS=.-: 00:03:35.650 05:28:46 -- scripts/common.sh@336 -- # read -ra ver2 00:03:35.650 05:28:46 -- scripts/common.sh@337 -- # local 'op=<' 00:03:35.650 05:28:46 -- scripts/common.sh@339 -- # ver1_l=2 00:03:35.650 05:28:46 -- scripts/common.sh@340 -- # ver2_l=1 00:03:35.650 05:28:46 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:35.650 05:28:46 -- scripts/common.sh@343 -- # case "$op" in 00:03:35.650 05:28:46 -- scripts/common.sh@344 -- # : 1 00:03:35.650 05:28:46 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:35.651 05:28:46 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:35.651 05:28:46 -- scripts/common.sh@364 -- # decimal 1 00:03:35.651 05:28:46 -- scripts/common.sh@352 -- # local d=1 00:03:35.651 05:28:46 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:35.651 05:28:46 -- scripts/common.sh@354 -- # echo 1 00:03:35.651 05:28:46 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:35.651 05:28:46 -- scripts/common.sh@365 -- # decimal 2 00:03:35.651 05:28:46 -- scripts/common.sh@352 -- # local d=2 00:03:35.651 05:28:46 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:35.651 05:28:46 -- scripts/common.sh@354 -- # echo 2 00:03:35.651 05:28:46 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:35.651 05:28:46 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:35.651 05:28:46 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:35.651 05:28:46 -- scripts/common.sh@367 -- # return 0 00:03:35.651 05:28:46 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:35.651 05:28:46 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:35.651 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:35.651 --rc genhtml_branch_coverage=1 00:03:35.651 --rc genhtml_function_coverage=1 00:03:35.651 --rc genhtml_legend=1 00:03:35.651 --rc geninfo_all_blocks=1 00:03:35.651 --rc geninfo_unexecuted_blocks=1 00:03:35.651 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:35.651 ' 00:03:35.651 05:28:46 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:35.651 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:35.651 --rc genhtml_branch_coverage=1 00:03:35.651 --rc genhtml_function_coverage=1 00:03:35.651 --rc genhtml_legend=1 00:03:35.651 --rc geninfo_all_blocks=1 00:03:35.651 --rc geninfo_unexecuted_blocks=1 00:03:35.651 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:35.651 ' 00:03:35.651 05:28:46 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:35.651 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:35.651 --rc genhtml_branch_coverage=1 00:03:35.651 --rc genhtml_function_coverage=1 00:03:35.651 --rc genhtml_legend=1 00:03:35.651 --rc geninfo_all_blocks=1 00:03:35.651 --rc geninfo_unexecuted_blocks=1 00:03:35.651 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:35.651 ' 00:03:35.651 05:28:46 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:35.651 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:35.651 --rc genhtml_branch_coverage=1 00:03:35.651 --rc genhtml_function_coverage=1 00:03:35.651 --rc genhtml_legend=1 00:03:35.651 --rc geninfo_all_blocks=1 00:03:35.651 --rc geninfo_unexecuted_blocks=1 00:03:35.651 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:35.651 ' 00:03:35.651 05:28:46 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:35.651 05:28:46 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:35.651 05:28:46 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:35.651 05:28:46 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:35.651 05:28:46 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:35.651 05:28:46 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:35.651 05:28:46 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:35.651 05:28:46 -- common/autotest_common.sh@1659 -- # [[ -e 
/sys/block/nvme0n1/queue/zoned ]] 00:03:35.651 05:28:46 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:35.651 05:28:46 -- setup/acl.sh@12 -- # devs=() 00:03:35.651 05:28:46 -- setup/acl.sh@12 -- # declare -a devs 00:03:35.651 05:28:46 -- setup/acl.sh@13 -- # drivers=() 00:03:35.651 05:28:46 -- setup/acl.sh@13 -- # declare -A drivers 00:03:35.651 05:28:46 -- setup/acl.sh@51 -- # setup reset 00:03:35.651 05:28:46 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:35.651 05:28:46 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:39.840 05:28:50 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:39.840 05:28:50 -- setup/acl.sh@16 -- # local dev driver 00:03:39.840 05:28:50 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:39.840 05:28:50 -- setup/acl.sh@15 -- # setup output status 00:03:39.840 05:28:50 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:39.840 05:28:50 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:43.130 Hugepages 00:03:43.130 node hugesize free / total 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 00:03:43.130 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 
05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:53 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:53 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:54 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:54 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:43.130 05:28:54 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:54 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:54 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:54 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:43.130 05:28:54 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:43.130 05:28:54 -- setup/acl.sh@20 -- # continue 00:03:43.130 05:28:54 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:54 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:43.130 05:28:54 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:43.130 05:28:54 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:43.130 05:28:54 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:43.130 05:28:54 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:43.130 05:28:54 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.130 05:28:54 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:43.130 05:28:54 -- setup/acl.sh@54 -- # run_test denied denied 00:03:43.130 05:28:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:43.130 05:28:54 -- common/autotest_common.sh@1093 -- # 
00:03:43.130 05:28:54 -- setup/acl.sh@54 -- # run_test denied denied
00:03:43.130 05:28:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:43.130 05:28:54 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:43.130 05:28:54 -- common/autotest_common.sh@10 -- # set +x
00:03:43.130 ************************************
00:03:43.130 START TEST denied
00:03:43.130 ************************************
00:03:43.131 05:28:54 -- common/autotest_common.sh@1114 -- # denied
00:03:43.131 05:28:54 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0'
00:03:43.131 05:28:54 -- setup/acl.sh@38 -- # setup output config
00:03:43.131 05:28:54 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0'
00:03:43.131 05:28:54 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:43.131 05:28:54 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:03:47.316 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0
00:03:47.316 05:28:57 -- setup/acl.sh@40 -- # verify 0000:d8:00.0
00:03:47.316 05:28:57 -- setup/acl.sh@28 -- # local dev driver
00:03:47.316 05:28:57 -- setup/acl.sh@30 -- # for dev in "$@"
00:03:47.316 05:28:57 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]]
00:03:47.316 05:28:57 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver
00:03:47.316 05:28:57 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme
00:03:47.316 05:28:57 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]]
00:03:47.316 05:28:57 -- setup/acl.sh@41 -- # setup reset
00:03:47.316 05:28:57 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:47.316 05:28:57 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:03:51.505
00:03:51.505 real 0m8.328s
00:03:51.505 user 0m2.747s
00:03:51.505 sys 0m4.978s
00:03:51.505 05:29:02 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:51.505 05:29:02 -- common/autotest_common.sh@10 -- # set +x
00:03:51.505 ************************************
00:03:51.505 END TEST denied
00:03:51.505 ************************************
00:03:51.505 05:29:02 -- setup/acl.sh@55 -- # run_test allowed allowed
00:03:51.505 05:29:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:51.505 05:29:02 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:51.505 05:29:02 -- common/autotest_common.sh@10 -- # set +x
00:03:51.505 ************************************
00:03:51.505 START TEST allowed
00:03:51.505 ************************************
00:03:51.505 05:29:02 -- common/autotest_common.sh@1114 -- # allowed
00:03:51.505 05:29:02 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0
00:03:51.505 05:29:02 -- setup/acl.sh@45 -- # setup output config
00:03:51.505 05:29:02 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*'
00:03:51.505 05:29:02 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:51.505 05:29:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:03:56.774 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:03:56.774 05:29:07 -- setup/acl.sh@47 -- # verify
00:03:56.774 05:29:07 -- setup/acl.sh@28 -- # local dev driver
00:03:56.774 05:29:07 -- setup/acl.sh@48 -- # setup reset
00:03:56.774 05:29:07 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:56.774 05:29:07 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:00.063
00:04:00.063 real 0m8.543s
00:04:00.063 user 0m2.408s
00:04:00.063 sys 0m4.652s
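The denied/allowed pair above drives scripts/setup.sh with PCI_BLOCKED and PCI_ALLOWED and then checks, via the driver symlink in sysfs, which driver the controller ended up bound to (nvme while blocked, vfio-pci after an allowed reset). A minimal sketch of that check follows; check_driver is a hypothetical name, since acl.sh's verify() does this inline.

    # Resolve a device's bound driver from sysfs and compare its basename
    # to the driver the allow/deny policy expects.
    check_driver() {
        local bdf=$1 expected=$2 driver
        [[ -e /sys/bus/pci/devices/$bdf ]] || return 1            # device present?
        driver=$(readlink -f "/sys/bus/pci/devices/$bdf/driver")  # e.g. .../drivers/nvme
        [[ ${driver##*/} == "$expected" ]]                        # compare basename
    }
    # check_driver 0000:d8:00.0 nvme   # succeeds in the 'denied' pass above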
00:04:00.063 05:29:11 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:00.063 05:29:11 -- common/autotest_common.sh@10 -- # set +x
00:04:00.063 ************************************
00:04:00.063 END TEST allowed
00:04:00.063 ************************************
00:04:00.063
00:04:00.063 real 0m24.322s
00:04:00.063 user 0m7.829s
00:04:00.063 sys 0m14.656s
00:04:00.063 05:29:11 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:00.063 05:29:11 -- common/autotest_common.sh@10 -- # set +x
00:04:00.063 ************************************
00:04:00.063 END TEST acl
00:04:00.063 ************************************
00:04:00.063 05:29:11 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:04:00.063 05:29:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:00.063 05:29:11 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:00.063 05:29:11 -- common/autotest_common.sh@10 -- # set +x
00:04:00.063 ************************************
00:04:00.063 START TEST hugepages
00:04:00.063 ************************************
00:04:00.063 05:29:11 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:04:00.063 * Looking for test storage...
00:04:00.063 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:00.063 05:29:11 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:04:00.063 05:29:11 -- common/autotest_common.sh@1690 -- # lcov --version
00:04:00.063 05:29:11 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:04:00.063 05:29:11 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:04:00.063 05:29:11 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:04:00.063 05:29:11 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:04:00.063 05:29:11 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:04:00.063 05:29:11 -- scripts/common.sh@335 -- # IFS=.-:
00:04:00.063 05:29:11 -- scripts/common.sh@335 -- # read -ra ver1
00:04:00.063 05:29:11 -- scripts/common.sh@336 -- # IFS=.-:
00:04:00.063 05:29:11 -- scripts/common.sh@336 -- # read -ra ver2
00:04:00.063 05:29:11 -- scripts/common.sh@337 -- # local 'op=<'
00:04:00.063 05:29:11 -- scripts/common.sh@339 -- # ver1_l=2
00:04:00.063 05:29:11 -- scripts/common.sh@340 -- # ver2_l=1
00:04:00.063 05:29:11 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:04:00.063 05:29:11 -- scripts/common.sh@343 -- # case "$op" in
00:04:00.063 05:29:11 -- scripts/common.sh@344 -- # : 1
00:04:00.063 05:29:11 -- scripts/common.sh@363 -- # (( v = 0 ))
00:04:00.063 05:29:11 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:04:00.063 05:29:11 -- scripts/common.sh@364 -- # decimal 1
00:04:00.063 05:29:11 -- scripts/common.sh@352 -- # local d=1
00:04:00.063 05:29:11 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:04:00.063 05:29:11 -- scripts/common.sh@354 -- # echo 1
00:04:00.063 05:29:11 -- scripts/common.sh@364 -- # ver1[v]=1
00:04:00.063 05:29:11 -- scripts/common.sh@365 -- # decimal 2
00:04:00.063 05:29:11 -- scripts/common.sh@352 -- # local d=2
00:04:00.063 05:29:11 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:04:00.063 05:29:11 -- scripts/common.sh@354 -- # echo 2
00:04:00.063 05:29:11 -- scripts/common.sh@365 -- # ver2[v]=2
00:04:00.063 05:29:11 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:04:00.063 05:29:11 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:04:00.063 05:29:11 -- scripts/common.sh@367 -- # return 0
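The lt 1.15 2 call above enters scripts/common.sh's cmp_versions, which splits both version strings and compares them component by component; the trace just above walks the first component (1 vs 2) and takes the success branch. Below is a simplified standalone equivalent; version_lt is a hypothetical name, and unlike the real cmp_versions it handles only numeric dot-separated fields and the '<' relation.

    # Compare two numeric dot-separated versions field by field.
    version_lt() {
        local IFS=.
        local -a a=($1) b=($2)
        local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
        for (( i = 0; i < n; i++ )); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # first differing field decides
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1    # equal versions are not "less than"
    }
    # version_lt 1.15 2 succeeds (1 < 2 in the first field), matching the
    # 'return 0' the trace reaches above.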
00:04:00.063 05:29:11 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:04:00.063 05:29:11 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:04:00.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:00.063 --rc genhtml_branch_coverage=1
00:04:00.063 --rc genhtml_function_coverage=1
00:04:00.063 --rc genhtml_legend=1
00:04:00.063 --rc geninfo_all_blocks=1
00:04:00.063 --rc geninfo_unexecuted_blocks=1
00:04:00.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:00.063 '
00:04:00.063 05:29:11 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:04:00.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:00.063 --rc genhtml_branch_coverage=1
00:04:00.063 --rc genhtml_function_coverage=1
00:04:00.063 --rc genhtml_legend=1
00:04:00.063 --rc geninfo_all_blocks=1
00:04:00.063 --rc geninfo_unexecuted_blocks=1
00:04:00.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:00.063 '
00:04:00.063 05:29:11 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:04:00.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:00.063 --rc genhtml_branch_coverage=1
00:04:00.063 --rc genhtml_function_coverage=1
00:04:00.063 --rc genhtml_legend=1
00:04:00.063 --rc geninfo_all_blocks=1
00:04:00.063 --rc geninfo_unexecuted_blocks=1
00:04:00.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:00.063 '
00:04:00.063 05:29:11 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:04:00.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:00.063 --rc genhtml_branch_coverage=1
00:04:00.063 --rc genhtml_function_coverage=1
00:04:00.063 --rc genhtml_legend=1
00:04:00.063 --rc geninfo_all_blocks=1
00:04:00.063 --rc geninfo_unexecuted_blocks=1
00:04:00.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:00.063 '
00:04:00.063 05:29:11 -- setup/hugepages.sh@10 -- # nodes_sys=()
00:04:00.063 05:29:11 -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:04:00.063 05:29:11 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:04:00.063 05:29:11 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:04:00.063 05:29:11 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:04:00.063 05:29:11 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:04:00.063 05:29:11 -- setup/common.sh@17 -- # local get=Hugepagesize
00:04:00.063 05:29:11 -- setup/common.sh@18 -- # local node=
00:04:00.063 05:29:11 -- setup/common.sh@19 -- # local var val
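The long trace that follows is get_meminfo scanning /proc/meminfo one field at a time until it reaches Hugepagesize. A compact equivalent under a hypothetical name (meminfo_val); SPDK's get_meminfo additionally takes an optional node argument and then parses /sys/devices/system/node/node<N>/meminfo instead.

    # Read /proc/meminfo and print the value for one key, as the scan
    # below does field by field.
    meminfo_val() {
        local key=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$key" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    # meminfo_val Hugepagesize -> 2048 on this runner, the value the scan
    # below eventually returns via 'echo 2048'.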
00:04:00.063 05:29:11 -- setup/common.sh@20 -- # local mem_f mem 00:04:00.063 05:29:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.063 05:29:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.063 05:29:11 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.063 05:29:11 -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.063 05:29:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.063 05:29:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40092744 kB' 'MemAvailable: 41733612 kB' 'Buffers: 6816 kB' 'Cached: 10628832 kB' 'SwapCached: 144 kB' 'Active: 8058384 kB' 'Inactive: 3166072 kB' 'Active(anon): 7150908 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592040 kB' 'Mapped: 149136 kB' 'Shmem: 8886948 kB' 'KReclaimable: 588968 kB' 'Slab: 1592748 kB' 'SReclaimable: 588968 kB' 'SUnreclaim: 1003780 kB' 'KernelStack: 21920 kB' 'PageTables: 8780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 11412940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217908 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val 
_ 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.063 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.063 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.064 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.064 05:29:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- 
setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # continue 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.324 05:29:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.324 05:29:11 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:00.324 05:29:11 -- setup/common.sh@33 -- # echo 2048 00:04:00.324 05:29:11 -- setup/common.sh@33 -- # return 0 00:04:00.324 05:29:11 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:00.324 05:29:11 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:00.324 05:29:11 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:00.324 05:29:11 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:00.324 05:29:11 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:00.324 05:29:11 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:00.324 05:29:11 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:00.324 05:29:11 -- setup/hugepages.sh@207 -- # get_nodes 00:04:00.324 05:29:11 -- setup/hugepages.sh@27 -- # local node 00:04:00.324 05:29:11 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:00.324 05:29:11 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:04:00.324 05:29:11 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:00.324 05:29:11 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:00.324 05:29:11 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:00.324 05:29:11 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:00.324 05:29:11 -- 
setup/hugepages.sh@208 -- # clear_hp 00:04:00.324 05:29:11 -- setup/hugepages.sh@37 -- # local node hp 00:04:00.324 05:29:11 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:00.324 05:29:11 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:00.324 05:29:11 -- setup/hugepages.sh@41 -- # echo 0 00:04:00.324 05:29:11 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:00.324 05:29:11 -- setup/hugepages.sh@41 -- # echo 0 00:04:00.324 05:29:11 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:00.324 05:29:11 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:00.324 05:29:11 -- setup/hugepages.sh@41 -- # echo 0 00:04:00.324 05:29:11 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:00.324 05:29:11 -- setup/hugepages.sh@41 -- # echo 0 00:04:00.324 05:29:11 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:00.324 05:29:11 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:00.324 05:29:11 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:00.324 05:29:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:00.324 05:29:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:00.324 05:29:11 -- common/autotest_common.sh@10 -- # set +x 00:04:00.324 ************************************ 00:04:00.324 START TEST default_setup 00:04:00.324 ************************************ 00:04:00.324 05:29:11 -- common/autotest_common.sh@1114 -- # default_setup 00:04:00.324 05:29:11 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:00.324 05:29:11 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:00.324 05:29:11 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:00.324 05:29:11 -- setup/hugepages.sh@51 -- # shift 00:04:00.324 05:29:11 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:00.324 05:29:11 -- setup/hugepages.sh@52 -- # local node_ids 00:04:00.324 05:29:11 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:00.324 05:29:11 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:00.324 05:29:11 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:00.324 05:29:11 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:00.324 05:29:11 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:00.324 05:29:11 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:00.324 05:29:11 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:00.324 05:29:11 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:00.324 05:29:11 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:00.325 05:29:11 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:00.325 05:29:11 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:00.325 05:29:11 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:00.325 05:29:11 -- setup/hugepages.sh@73 -- # return 0 00:04:00.325 05:29:11 -- setup/hugepages.sh@137 -- # setup output 00:04:00.325 05:29:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:00.325 05:29:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:03.610 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:03.610 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:03.610 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:03.610 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:03.610 0000:00:04.3 (8086 2021): 
ioatdma -> vfio-pci 00:04:03.610 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:03.610 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:03.610 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:03.610 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:03.610 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:03.869 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:03.869 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:03.869 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:03.869 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:03.869 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:03.869 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:05.247 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:05.511 05:29:16 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:05.511 05:29:16 -- setup/hugepages.sh@89 -- # local node 00:04:05.511 05:29:16 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:05.511 05:29:16 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:05.511 05:29:16 -- setup/hugepages.sh@92 -- # local surp 00:04:05.511 05:29:16 -- setup/hugepages.sh@93 -- # local resv 00:04:05.511 05:29:16 -- setup/hugepages.sh@94 -- # local anon 00:04:05.511 05:29:16 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:05.511 05:29:16 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:05.511 05:29:16 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:05.511 05:29:16 -- setup/common.sh@18 -- # local node= 00:04:05.511 05:29:16 -- setup/common.sh@19 -- # local var val 00:04:05.511 05:29:16 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.511 05:29:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.511 05:29:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.511 05:29:16 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.511 05:29:16 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.511 05:29:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42282740 kB' 'MemAvailable: 43923600 kB' 'Buffers: 6816 kB' 'Cached: 10628964 kB' 'SwapCached: 144 kB' 'Active: 8058856 kB' 'Inactive: 3166072 kB' 'Active(anon): 7151380 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592508 kB' 'Mapped: 148772 kB' 'Shmem: 8887080 kB' 'KReclaimable: 588960 kB' 'Slab: 1591892 kB' 'SReclaimable: 588960 kB' 'SUnreclaim: 1002932 kB' 'KernelStack: 21984 kB' 'PageTables: 8632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11413568 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217988 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 
-- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.511 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.511 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- 
# read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.512 05:29:16 -- setup/common.sh@33 -- # echo 0 00:04:05.512 05:29:16 -- setup/common.sh@33 -- # return 0 00:04:05.512 05:29:16 -- setup/hugepages.sh@97 -- # anon=0 00:04:05.512 05:29:16 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:05.512 05:29:16 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.512 05:29:16 -- setup/common.sh@18 -- # local node= 00:04:05.512 05:29:16 -- setup/common.sh@19 -- # local var val 00:04:05.512 05:29:16 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.512 05:29:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.512 05:29:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.512 05:29:16 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.512 05:29:16 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.512 05:29:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42284204 kB' 'MemAvailable: 43925064 kB' 'Buffers: 6816 kB' 'Cached: 10628968 kB' 'SwapCached: 144 kB' 'Active: 8059708 kB' 'Inactive: 3166072 kB' 'Active(anon): 7152232 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593296 kB' 'Mapped: 148760 kB' 'Shmem: 8887084 kB' 'KReclaimable: 588960 kB' 'Slab: 1591880 kB' 'SReclaimable: 588960 kB' 'SUnreclaim: 1002920 kB' 'KernelStack: 22048 kB' 'PageTables: 9056 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11413580 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 
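The repeated scans around this point are verify_nr_hugepages collecting AnonHugePages, HugePages_Surp and HugePages_Rsvd after default_setup has requested 1024 pages. A condensed standalone check of the same counters; verify_hugepages is a hypothetical name and uses awk shortcuts instead of SPDK's field-by-field loop. In this run Total and Free both read 1024 while Surp and Rsvd are 0.

    verify_hugepages() {
        local expected=$1 total free surp rsvd
        total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
        free=$(awk '/^HugePages_Free:/ {print $2}' /proc/meminfo)
        surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
        rsvd=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
        (( surp == 0 && rsvd == 0 )) || return 1    # no surplus or reserved pages
        (( total == expected && free == expected )) # e.g. 1024 after default_setup
    }
    # verify_hugepages 1024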
00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.512 05:29:16 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.512 05:29:16 -- setup/common.sh@32 -- # [[ SwapFree 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:05.512 05:29:16 -- setup/common.sh@32 -- # continue
00:04:05.513 05:29:16 -- setup/common.sh@31 -- # IFS=': '
00:04:05.513 05:29:16 -- setup/common.sh@31 -- # read -r var val _
00:04:05.513 [... identical @32 test / @32 continue / @31 IFS / @31 read cycle repeats for every remaining /proc/meminfo field (Zswap through HugePages_Rsvd) until the requested field matches ...]
00:04:05.513 05:29:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:05.513 05:29:16 -- setup/common.sh@33 -- # echo 0
00:04:05.513 05:29:16 -- setup/common.sh@33 -- # return 0
00:04:05.513 05:29:16 -- setup/hugepages.sh@99 -- # surp=0
00:04:05.513 05:29:16 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:05.513 05:29:16 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:05.513 05:29:16 -- setup/common.sh@18 -- # local node=
00:04:05.513 05:29:16 -- setup/common.sh@19 -- # local var val
00:04:05.513 05:29:16 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.513 05:29:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.513 05:29:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:05.513 05:29:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:05.513 05:29:16 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.513 05:29:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.513 05:29:16 -- setup/common.sh@31 -- # IFS=': '
00:04:05.513 05:29:16 -- setup/common.sh@31 -- # read -r var val _
00:04:05.514 05:29:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42285884 kB' 'MemAvailable: 43926744 kB' 'Buffers: 6816 kB' 'Cached: 10628972 kB' 'SwapCached: 144 kB' 'Active: 8058728 kB' 'Inactive: 3166072 kB' 'Active(anon): 7151252 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592308 kB' 'Mapped: 148680 kB' 'Shmem: 8887088 kB' 'KReclaimable: 588960 kB' 'Slab: 1591856 kB' 'SReclaimable: 588960 kB' 'SUnreclaim: 1002896 kB' 'KernelStack: 22016 kB' 'PageTables: 8316 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11413596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
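The trace above is setup/common.sh's get_meminfo walking /proc/meminfo one field at a time with IFS=': '. A minimal sketch of the same lookup, under the assumption that the helper name below is illustrative and not the test's own code:

    get_meminfo_sketch() {            # e.g. get_meminfo_sketch HugePages_Surp
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            # skip every field until the requested one, then print its value
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
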
00:04:05.514 05:29:16 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:05.514 05:29:16 -- setup/common.sh@32 -- # continue
00:04:05.514 [... identical @32 test / @32 continue / @31 IFS / @31 read cycle repeats for every /proc/meminfo field until HugePages_Rsvd ...]
00:04:05.515 05:29:16 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:05.515 05:29:16 -- setup/common.sh@33 -- # echo 0
00:04:05.515 05:29:16 -- setup/common.sh@33 -- # return 0
00:04:05.515 05:29:16 -- setup/hugepages.sh@100 -- # resv=0
00:04:05.515 05:29:16 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:05.515 nr_hugepages=1024
00:04:05.515 05:29:16 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:05.515 resv_hugepages=0
00:04:05.515 05:29:16 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:05.515 surplus_hugepages=0
00:04:05.515 05:29:16 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:05.515 anon_hugepages=0
00:04:05.515 05:29:16 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:05.515 05:29:16 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:05.515 05:29:16 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:05.515 05:29:16 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:05.515 05:29:16 -- setup/common.sh@18 -- # local node=
00:04:05.515 05:29:16 -- setup/common.sh@19 -- # local var val
00:04:05.515 05:29:16 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.515 05:29:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.515 05:29:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:05.515 05:29:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:05.515 05:29:16 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.515 05:29:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.515 05:29:16 -- setup/common.sh@31 -- # IFS=': '
00:04:05.515 05:29:16 -- setup/common.sh@31 -- # read -r var val _
00:04:05.515 05:29:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42288980 kB' 'MemAvailable: 43929840 kB' 'Buffers: 6816 kB' 'Cached: 10628992 kB' 'SwapCached: 144 kB' 'Active: 8058948 kB' 'Inactive: 3166072 kB' 'Active(anon): 7151472 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592300 kB' 'Mapped: 148680 kB' 'Shmem: 8887108 kB' 'KReclaimable: 588960 kB' 'Slab: 1591824 kB' 'SReclaimable: 588960 kB' 'SUnreclaim: 1002864 kB' 'KernelStack: 22096 kB' 'PageTables: 9168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11413608 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218164 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
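Both reads returned 0, so the @107/@109 checks above reduce to simple arithmetic. A sketch with the values just read (variable names follow the trace):

    nr_hugepages=1024 surp=0 resv=0
    (( 1024 == nr_hugepages + surp + resv )) && echo ok   # 1024 == 1024 + 0 + 0
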
00:04:05.515 05:29:16 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:05.515 05:29:16 -- setup/common.sh@32 -- # continue
00:04:05.516 [... identical @32 test / @32 continue / @31 IFS / @31 read cycle repeats for every /proc/meminfo field until HugePages_Total ...]
00:04:05.516 05:29:16 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:05.516 05:29:16 -- setup/common.sh@33 -- # echo 1024
00:04:05.516 05:29:16 -- setup/common.sh@33 -- # return 0
00:04:05.516 05:29:16 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:05.516 05:29:16 -- setup/hugepages.sh@112 -- # get_nodes
00:04:05.516 05:29:16 -- setup/hugepages.sh@27 -- # local node
00:04:05.516 05:29:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:05.516 05:29:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:05.516 05:29:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:05.516 05:29:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:05.516 05:29:16 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:05.516 05:29:16 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:05.516 05:29:16 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:05.516 05:29:16 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:05.516 05:29:16 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:05.516 05:29:16 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:05.516 05:29:16 -- setup/common.sh@18 -- # local node=0
00:04:05.516 05:29:16 -- setup/common.sh@19 -- # local var val
00:04:05.516 05:29:16 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.516 05:29:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.517 05:29:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:05.517 05:29:16 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:05.517 05:29:16 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.517 05:29:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.517 05:29:16 -- setup/common.sh@31 -- # IFS=': '
00:04:05.517 05:29:16 -- setup/common.sh@31 -- # read -r var val _
00:04:05.517 05:29:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22111556 kB' 'MemUsed: 10522880 kB' 'SwapCached: 44 kB' 'Active: 5525636 kB' 'Inactive: 535260 kB' 'Active(anon): 4748076 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5804176 kB' 'Mapped: 92440 kB' 'AnonPages: 259700 kB' 'Shmem: 4491368 kB' 'KernelStack: 10328 kB' 'PageTables: 5756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 405008 kB' 'Slab: 888992 kB' 'SReclaimable: 405008 kB' 'SUnreclaim: 483984 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:05.517 05:29:16 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:05.517 05:29:16 -- setup/common.sh@32 -- # continue
00:04:05.517 [... identical cycle repeats for every node0 meminfo field until HugePages_Surp ...]
00:04:05.517 05:29:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:05.517 05:29:16 -- setup/common.sh@33 -- # echo 0
00:04:05.518 05:29:16 -- setup/common.sh@33 -- # return 0
00:04:05.518 05:29:16 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:05.518 05:29:16 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:05.518 05:29:16 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:05.518 05:29:16 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
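Node-scoped reads go through /sys/devices/system/node/node0/meminfo, as the mem_f reassignment above shows. A sketch of that per-node view, plus the sysfs knob a per-node request like the upcoming NRHUGE=512 HUGENODE=0,1 maps to (the hugepages-2048kB path is the standard Linux sysfs location, assumed present on this host):

    for n in /sys/devices/system/node/node[0-9]*; do
        grep HugePages_ "$n/meminfo"          # per-node Total/Free/Surp counters
    done
    # request 512 x 2 MB pages on node 0 (requires root):
    # echo 512 > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
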
00:04:05.518 05:29:16 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:05.518 node0=1024 expecting 1024
00:04:05.518 05:29:16 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:05.518 
00:04:05.518 real 0m5.313s
00:04:05.518 user 0m1.427s
00:04:05.518 sys 0m2.449s
00:04:05.518 05:29:16 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:05.518 05:29:16 -- common/autotest_common.sh@10 -- # set +x
00:04:05.518 ************************************
00:04:05.518 END TEST default_setup
00:04:05.518 ************************************
00:04:05.518 05:29:16 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:04:05.518 05:29:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:05.518 05:29:16 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:05.518 05:29:16 -- common/autotest_common.sh@10 -- # set +x
00:04:05.518 ************************************
00:04:05.518 START TEST per_node_1G_alloc
00:04:05.518 ************************************
00:04:05.518 05:29:16 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:04:05.518 05:29:16 -- setup/hugepages.sh@143 -- # local IFS=,
00:04:05.518 05:29:16 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:04:05.518 05:29:16 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:05.518 05:29:16 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:04:05.518 05:29:16 -- setup/hugepages.sh@51 -- # shift
00:04:05.518 05:29:16 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:04:05.518 05:29:16 -- setup/hugepages.sh@52 -- # local node_ids
00:04:05.518 05:29:16 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:05.518 05:29:16 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:05.518 05:29:16 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:04:05.518 05:29:16 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:04:05.518 05:29:16 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:05.518 05:29:16 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:05.518 05:29:16 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:05.518 05:29:16 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:05.518 05:29:16 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:05.518 05:29:16 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:04:05.518 05:29:16 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:05.518 05:29:16 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:05.518 05:29:16 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:05.518 05:29:16 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:05.518 05:29:16 -- setup/hugepages.sh@73 -- # return 0
00:04:05.518 05:29:16 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:04:05.518 05:29:16 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:04:05.518 05:29:16 -- setup/hugepages.sh@146 -- # setup output
00:04:05.518 05:29:16 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:05.518 05:29:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:08.808 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:08.808 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:08.808 05:29:20 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:04:08.808 05:29:20 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:08.808 05:29:20 -- setup/hugepages.sh@89 -- # local node
00:04:08.808 05:29:20 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:08.808 05:29:20 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:08.808 05:29:20 -- setup/hugepages.sh@92 -- # local surp
00:04:08.808 05:29:20 -- setup/hugepages.sh@93 -- # local resv
00:04:08.808 05:29:20 -- setup/hugepages.sh@94 -- # local anon
00:04:08.808 05:29:20 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
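The @96 test above is comparing the kernel's transparent-hugepage mode string; anon hugepages are only counted when THP is not globally disabled. A sketch of the same gate (the sysfs path is the standard kernel location, an assumption about this host):

    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    if [[ $thp != *"[never]"* ]]; then
        echo "THP active; AnonHugePages will be read next"
    fi
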
00:04:08.808 05:29:20 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:08.808 05:29:20 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:08.808 05:29:20 -- setup/common.sh@18 -- # local node=
00:04:08.808 05:29:20 -- setup/common.sh@19 -- # local var val
00:04:08.808 05:29:20 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.808 05:29:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.808 05:29:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:08.808 05:29:20 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:08.808 05:29:20 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.808 05:29:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:08.808 05:29:20 -- setup/common.sh@31 -- # IFS=': '
00:04:08.808 05:29:20 -- setup/common.sh@31 -- # read -r var val _
00:04:08.808 05:29:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42323288 kB' 'MemAvailable: 43964148 kB' 'Buffers: 6816 kB' 'Cached: 10629080 kB' 'SwapCached: 144 kB' 'Active: 8059808 kB' 'Inactive: 3166072 kB' 'Active(anon): 7152332 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593520 kB' 'Mapped: 147576 kB' 'Shmem: 8887196 kB' 'KReclaimable: 588960 kB' 'Slab: 1591360 kB' 'SReclaimable: 588960 kB' 'SUnreclaim: 1002400 kB' 'KernelStack: 21824 kB' 'PageTables: 8412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11402076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:09.071 05:29:20 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:09.071 05:29:20 -- setup/common.sh@32 -- # continue
00:04:09.072 [... identical @32 test / @32 continue / @31 IFS / @31 read cycle repeats field by field; the captured log is truncated during this scan ...]
setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.072 05:29:20 -- setup/common.sh@33 -- # echo 0 00:04:09.072 05:29:20 -- setup/common.sh@33 -- # return 0 00:04:09.072 05:29:20 -- setup/hugepages.sh@97 -- # anon=0 00:04:09.072 05:29:20 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:09.072 05:29:20 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.072 05:29:20 -- setup/common.sh@18 -- # local node= 00:04:09.072 05:29:20 -- setup/common.sh@19 -- # local var val 00:04:09.072 05:29:20 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.072 05:29:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.072 05:29:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.072 05:29:20 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.072 05:29:20 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.072 05:29:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42324840 kB' 'MemAvailable: 43965668 kB' 'Buffers: 6816 kB' 'Cached: 10629084 kB' 'SwapCached: 144 kB' 'Active: 8059128 kB' 'Inactive: 3166072 kB' 'Active(anon): 7151652 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593016 kB' 'Mapped: 147524 kB' 'Shmem: 8887200 kB' 'KReclaimable: 588928 kB' 'Slab: 1591492 kB' 'SReclaimable: 588928 kB' 'SUnreclaim: 1002564 kB' 'KernelStack: 21856 kB' 'PageTables: 8408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11402088 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 
00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.072 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.072 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.073 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.073 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.073 05:29:20 -- setup/common.sh@33 -- # echo 0 00:04:09.073 05:29:20 -- setup/common.sh@33 -- # return 0 00:04:09.073 05:29:20 -- setup/hugepages.sh@99 -- # surp=0 00:04:09.073 05:29:20 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:09.073 05:29:20 -- setup/common.sh@17 -- # local 
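The two scans above (and the two that follow) are all the same helper at work: setup/common.sh's get_meminfo slurps /proc/meminfo, or a per-node meminfo file, with mapfile, then walks it with IFS=': ' read, continuing past every field until the requested one matches and echoing its value. A minimal sketch of that pattern, reconstructed only from what this trace shows; get_meminfo_sketch is a hypothetical name, and the real helper in setup/common.sh may differ in detail:

    #!/usr/bin/env bash
    # Sketch of the get_meminfo pattern visible in the trace above;
    # not the verbatim setup/common.sh source.
    shopt -s extglob

    get_meminfo_sketch() {
        local get=$1 node=${2-}
        local mem_f=/proc/meminfo
        # A node argument switches to the per-node meminfo file,
        # as in the node0 query later in this run.
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem <"$mem_f"
        # Per-node files prefix every line with "Node N "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        local line var val _
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            # continue past non-matching fields, exactly as traced above
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        echo 0
    }

On this box, get_meminfo_sketch AnonHugePages would print 0 and get_meminfo_sketch HugePages_Total would print 1024, matching the snapshot just logged: 1024 pages of Hugepagesize 2048 kB, which is exactly the 2097152 kB Hugetlb figure.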
00:04:09.073 05:29:20 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:09.073 05:29:20 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:09.073 05:29:20 -- setup/common.sh@18 -- # local node=
00:04:09.073 05:29:20 -- setup/common.sh@19 -- # local var val
00:04:09.073 05:29:20 -- setup/common.sh@20 -- # local mem_f mem
00:04:09.073 05:29:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:09.073 05:29:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:09.073 05:29:20 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:09.073 05:29:20 -- setup/common.sh@28 -- # mapfile -t mem
00:04:09.073 05:29:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:09.074 05:29:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42323840 kB' 'MemAvailable: 43964668 kB' 'Buffers: 6816 kB' 'Cached: 10629096 kB' 'SwapCached: 144 kB' 'Active: 8058988 kB' 'Inactive: 3166072 kB' 'Active(anon): 7151512 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592856 kB' 'Mapped: 147524 kB' 'Shmem: 8887212 kB' 'KReclaimable: 588928 kB' 'Slab: 1591492 kB' 'SReclaimable: 588928 kB' 'SUnreclaim: 1002564 kB' 'KernelStack: 21856 kB' 'PageTables: 8408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11402100 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:09.074 05:29:20 -- setup/common.sh@31 -- # IFS=': '
00:04:09.074 05:29:20 -- setup/common.sh@31 -- # read -r var val _
00:04:09.074 [~50 trace lines: setup/common.sh@32 compares each field from MemTotal through HugePages_Free against HugePages_Rsvd and continues past every non-match]
00:04:09.075 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:09.075 05:29:20 -- setup/common.sh@33 -- # echo 0
00:04:09.075 05:29:20 -- setup/common.sh@33 -- # return 0
00:04:09.075 05:29:20 -- setup/hugepages.sh@100 -- # resv=0
00:04:09.075 05:29:20 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:09.075 nr_hugepages=1024
00:04:09.075 05:29:20 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:09.075 resv_hugepages=0
00:04:09.075 05:29:20 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:09.075 surplus_hugepages=0
00:04:09.075 05:29:20 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:09.075 anon_hugepages=0
00:04:09.075 05:29:20 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:09.075 05:29:20 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
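With anon, surp and resv collected (all 0 here), verify_nr_hugepages boils down to the two arithmetic tests just traced: the pool must account for the requested pages plus any surplus and reserved pages, and with both of those at zero it must equal nr_hugepages exactly. A standalone sketch of that accounting, assuming the hypothetical get_meminfo_sketch helper above; the real checks are the (( ... )) tests from setup/hugepages.sh:

    # Sketch of the verification arithmetic traced above.
    verify_nr_hugepages_sketch() {
        local nr_hugepages=$1
        local surp resv total
        surp=$(get_meminfo_sketch HugePages_Surp)
        resv=$(get_meminfo_sketch HugePages_Rsvd)
        total=$(get_meminfo_sketch HugePages_Total)
        echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp"
        # The pool must cover the request plus surplus/reserved pages...
        (( total == nr_hugepages + surp + resv )) || return 1
        # ...and, both being zero here, match the request exactly.
        (( total == nr_hugepages ))
    }

    verify_nr_hugepages_sketch 1024 && echo verified   # passes with the 1024/0/0 seen here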
00:04:09.075 05:29:20 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:09.075 05:29:20 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:09.075 05:29:20 -- setup/common.sh@18 -- # local node=
00:04:09.075 05:29:20 -- setup/common.sh@19 -- # local var val
00:04:09.075 05:29:20 -- setup/common.sh@20 -- # local mem_f mem
00:04:09.075 05:29:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:09.075 05:29:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:09.075 05:29:20 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:09.075 05:29:20 -- setup/common.sh@28 -- # mapfile -t mem
00:04:09.075 05:29:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:09.075 05:29:20 -- setup/common.sh@31 -- # IFS=': '
00:04:09.075 05:29:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42323840 kB' 'MemAvailable: 43964668 kB' 'Buffers: 6816 kB' 'Cached: 10629112 kB' 'SwapCached: 144 kB' 'Active: 8058544 kB' 'Inactive: 3166072 kB' 'Active(anon): 7151068 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592340 kB' 'Mapped: 147524 kB' 'Shmem: 8887228 kB' 'KReclaimable: 588928 kB' 'Slab: 1591492 kB' 'SReclaimable: 588928 kB' 'SUnreclaim: 1002564 kB' 'KernelStack: 21840 kB' 'PageTables: 8356 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11402116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:09.075 05:29:20 -- setup/common.sh@31 -- # read -r var val _
00:04:09.076 [~50 trace lines: setup/common.sh@32 compares each field from MemTotal through Unaccepted against HugePages_Total and continues past every non-match]
00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:09.077 05:29:20 -- setup/common.sh@33 -- # echo 1024
00:04:09.077 05:29:20 -- setup/common.sh@33 -- # return 0
00:04:09.077 05:29:20 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:09.077 05:29:20 -- setup/hugepages.sh@112 -- # get_nodes
00:04:09.077 05:29:20 -- setup/hugepages.sh@27 -- # local node
00:04:09.077 05:29:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:09.077 05:29:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:09.077 05:29:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:09.077 05:29:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:09.077 05:29:20 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:09.077 05:29:20 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:09.077 05:29:20 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:09.077 05:29:20 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:09.077 05:29:20 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:09.077 05:29:20 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:09.077 05:29:20 -- setup/common.sh@18 -- # local node=0
00:04:09.077 05:29:20 -- setup/common.sh@19 -- # local var val
00:04:09.077 05:29:20 -- setup/common.sh@20 -- # local mem_f mem
00:04:09.077 05:29:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:09.077 05:29:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:09.077 05:29:20 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:09.077 05:29:20 -- setup/common.sh@28 -- # mapfile -t mem
00:04:09.077 05:29:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': '
00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _
00:04:09.077 05:29:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23163140 kB' 'MemUsed: 9471296 kB' 'SwapCached: 44 kB' 'Active: 5524616 kB' 'Inactive: 535260 kB' 'Active(anon): 4747056 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5804204 kB' 'Mapped: 91616 kB' 'AnonPages: 259112 kB' 'Shmem: 4491396 kB' 'KernelStack: 10056 kB' 'PageTables: 4620 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 404976 kB' 'Slab: 888928 kB' 'SReclaimable: 404976 kB' 'SUnreclaim: 483952 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
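The global check passed, so the script moves to per-node accounting: get_nodes records an expected 512 pages for each of this host's two NUMA nodes (the 1024-page pool split evenly), and each node's actual figures come from /sys/devices/system/node/nodeN/meminfo; the node0 snapshot just printed reports HugePages_Total: 512 and HugePages_Surp: 0. A sketch of that per-node walk, again reusing the hypothetical get_meminfo_sketch helper and the extglob setting from the first sketch:

    # Sketch of the per-node split being verified below: each of the two
    # nodes is expected to hold half of the 1024-page pool.
    check_nodes_sketch() {
        local node id total surp
        for node in /sys/devices/system/node/node+([0-9]); do
            id=${node##*node}
            total=$(get_meminfo_sketch HugePages_Total "$id")
            surp=$(get_meminfo_sketch HugePages_Surp "$id")
            echo "node$id: HugePages_Total=$total HugePages_Surp=$surp (expected 512/0)"
        done
    }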
kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.077 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.077 05:29:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 
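The repetitive scan above (and continuing below) is setup/common.sh's get_meminfo walking /sys/devices/system/node/node0/meminfo line by line until the key matches HugePages_Surp; the backslash-escaped right-hand side is just how bash xtrace renders a quoted, literal [[ ]] pattern. A minimal, self-contained reconstruction of that lookup, inferred from the trace rather than copied from the SPDK source, might look like:

shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {    # get_meminfo <key> [node] -> prints the value column
    local get=$1 node=${2:-} var val _
    local mem_f=/proc/meminfo mem line

    # With a node argument, prefer that node's own meminfo when it exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem <"$mem_f"
    # Per-node files prefix each line with "Node <id> "; strip that prefix,
    # exactly as the mem=("${mem[@]#Node +([0-9]) }") step in the trace does.
    mem=("${mem[@]#Node +([0-9]) }")

    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo HugePages_Surp 0   # the call traced here; prints 0 on this host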
00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@33 -- # echo 0 00:04:09.078 05:29:20 -- setup/common.sh@33 -- # return 0 00:04:09.078 05:29:20 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:09.078 05:29:20 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:09.078 05:29:20 -- setup/hugepages.sh@116 -- # (( 
nodes_test[node] += resv )) 00:04:09.078 05:29:20 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:09.078 05:29:20 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.078 05:29:20 -- setup/common.sh@18 -- # local node=1 00:04:09.078 05:29:20 -- setup/common.sh@19 -- # local var val 00:04:09.078 05:29:20 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.078 05:29:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.078 05:29:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:09.078 05:29:20 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:09.078 05:29:20 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.078 05:29:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19166524 kB' 'MemUsed: 8482836 kB' 'SwapCached: 100 kB' 'Active: 2533780 kB' 'Inactive: 2630812 kB' 'Active(anon): 2403864 kB' 'Inactive(anon): 2324792 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4831892 kB' 'Mapped: 55908 kB' 'AnonPages: 333004 kB' 'Shmem: 4395856 kB' 'KernelStack: 11784 kB' 'PageTables: 3744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 183936 kB' 'Slab: 702536 kB' 'SReclaimable: 183936 kB' 'SUnreclaim: 518600 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.078 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 
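By this point the trace has also shown the other half of the comparison: get_nodes (hugepages.sh@27-33) recorded what sysfs reports per node (512 pages each), and the loop at hugepages.sh@115-117, still running below, folds each node's reserved and surplus pages into the expected value. A sketch of that bookkeeping, assuming the standard sysfs per-node hugepage path (the trace never prints which file get_nodes actually reads) and illustrative variable names:

shopt -s extglob
declare -a nodes_sys nodes_test
nodes_test=(512 512)   # the test's split of 1024 pages across 2 nodes
resv=0                 # HugePages_Rsvd, gathered earlier in the run

# What the kernel actually placed on each node, per sysfs (assumed path).
for node in /sys/devices/system/node/node+([0-9]); do
    nodes_sys[${node##*node}]=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
done

for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))
    # Surplus pages (HugePages_Surp, 0 in this run) would inflate a node too.
    surp=$(awk -v n="$node" '$1 == "Node" && $2 == n && $3 == "HugePages_Surp:" { print $4 }' \
        "/sys/devices/system/node/node$node/meminfo")
    (( nodes_test[node] += surp ))
    echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    [[ ${nodes_sys[node]} == "${nodes_test[node]}" ]] || exit 1
done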
00:04:09.078 05:29:20 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.078 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 
00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # continue 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.079 05:29:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.079 05:29:20 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.079 05:29:20 -- setup/common.sh@33 -- # echo 0 00:04:09.079 05:29:20 -- setup/common.sh@33 -- # return 0 00:04:09.079 05:29:20 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:09.079 05:29:20 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:09.079 05:29:20 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:09.079 05:29:20 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:09.079 05:29:20 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:09.079 node0=512 expecting 512 00:04:09.079 05:29:20 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:09.079 05:29:20 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:09.079 05:29:20 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:09.079 05:29:20 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:09.079 node1=512 expecting 512 00:04:09.079 05:29:20 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:09.079 00:04:09.079 real 0m3.546s 00:04:09.079 user 0m1.340s 00:04:09.079 sys 0m2.269s 00:04:09.079 05:29:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:09.079 05:29:20 -- common/autotest_common.sh@10 -- # set +x 00:04:09.079 ************************************ 00:04:09.079 END TEST per_node_1G_alloc 00:04:09.079 ************************************ 00:04:09.079 05:29:20 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:09.079 05:29:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:09.079 05:29:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:09.079 05:29:20 -- common/autotest_common.sh@10 -- # set +x 00:04:09.079 ************************************ 00:04:09.079 START TEST even_2G_alloc 00:04:09.079 ************************************ 00:04:09.079 05:29:20 -- common/autotest_common.sh@1114 -- # even_2G_alloc 00:04:09.079 05:29:20 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:09.079 05:29:20 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:09.079 05:29:20 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:09.079 05:29:20 -- setup/hugepages.sh@55 -- # (( size >= 
default_hugepages )) 00:04:09.079 05:29:20 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:09.079 05:29:20 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:09.079 05:29:20 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:09.079 05:29:20 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:09.079 05:29:20 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:09.079 05:29:20 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:09.079 05:29:20 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:09.079 05:29:20 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:09.079 05:29:20 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:09.079 05:29:20 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:09.079 05:29:20 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.079 05:29:20 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:09.079 05:29:20 -- setup/hugepages.sh@83 -- # : 512 00:04:09.079 05:29:20 -- setup/hugepages.sh@84 -- # : 1 00:04:09.079 05:29:20 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.079 05:29:20 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:09.079 05:29:20 -- setup/hugepages.sh@83 -- # : 0 00:04:09.079 05:29:20 -- setup/hugepages.sh@84 -- # : 0 00:04:09.079 05:29:20 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.079 05:29:20 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:09.079 05:29:20 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:09.079 05:29:20 -- setup/hugepages.sh@153 -- # setup output 00:04:09.079 05:29:20 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:09.079 05:29:20 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:13.276 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:13.276 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:13.276 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:13.276 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:13.277 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:13.277 05:29:23 -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:13.277 05:29:23 -- setup/hugepages.sh@89 -- # local node 00:04:13.277 05:29:23 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:13.277 05:29:23 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:13.277 05:29:23 -- setup/hugepages.sh@92 -- # local surp 00:04:13.277 05:29:23 -- setup/hugepages.sh@93 -- # local resv 00:04:13.277 05:29:23 -- setup/hugepages.sh@94 -- # local anon 00:04:13.277 05:29:23 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:13.277 05:29:23 -- 
setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:13.277 05:29:23 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:13.277 05:29:23 -- setup/common.sh@18 -- # local node= 00:04:13.277 05:29:23 -- setup/common.sh@19 -- # local var val 00:04:13.277 05:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.277 05:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.277 05:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.277 05:29:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.277 05:29:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.277 05:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42351280 kB' 'MemAvailable: 43992052 kB' 'Buffers: 6816 kB' 'Cached: 10629228 kB' 'SwapCached: 144 kB' 'Active: 8061224 kB' 'Inactive: 3166072 kB' 'Active(anon): 7153748 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593944 kB' 'Mapped: 148180 kB' 'Shmem: 8887344 kB' 'KReclaimable: 588872 kB' 'Slab: 1591888 kB' 'SReclaimable: 588872 kB' 'SUnreclaim: 1003016 kB' 'KernelStack: 21872 kB' 'PageTables: 8488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11405812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218052 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 
05:29:23 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 
05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.277 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.277 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 
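The AnonHugePages pass running here was gated a few lines back (hugepages.sh@96) by [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]: transparent hugepages are not fully disabled on this host, so verify_nr_hugepages has to record whatever THP handed out before judging its own pool. A hedged sketch of that gate, using the standard sysfs knob and illustrative names:

# "always [madvise] never" on this host, per the pattern test in the trace.
thp_state=$(</sys/kernel/mm/transparent_hugepage/enabled)

if [[ $thp_state != *"[never]"* ]]; then
    # THP is live, so anonymous hugepages may exist; record them (in kB).
    anon=$(awk '$1 == "AnonHugePages:" { print $2 }' /proc/meminfo)
else
    anon=0
fi
echo "anon=$anon"   # 0 kB in this run, matching the echo that ends the scan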
00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.278 05:29:23 -- setup/common.sh@33 -- # echo 0 00:04:13.278 05:29:23 -- setup/common.sh@33 -- # return 0 00:04:13.278 05:29:23 -- setup/hugepages.sh@97 -- # anon=0 00:04:13.278 05:29:23 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:13.278 05:29:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.278 05:29:23 -- setup/common.sh@18 -- # local node= 00:04:13.278 05:29:23 -- setup/common.sh@19 -- # local var val 00:04:13.278 05:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.278 05:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.278 05:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.278 05:29:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.278 05:29:23 -- setup/common.sh@28 -- # mapfile -t mem 
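For reference, the per-node target now being verified was derived at the top of even_2G_alloc (hugepages.sh@49-84 above): a 2097152 kB request over the 2048 kB default page size (Hugepagesize in the meminfo dumps) gives nr_hugepages=1024, dealt out from the last node backwards as 512 apiece. The trace only shows the resulting assignments and some ':' no-ops, so the following dealing loop is a speculative reconstruction, not the script's literal code:

size=2097152            # kB requested by the test
default_hugepages=2048  # kB per page, per Hugepagesize above
nr_hugepages=$(( size / default_hugepages ))   # 1024

_no_nodes=2
_nr_hugepages=$nr_hugepages
declare -a nodes_test

while (( _no_nodes > 0 )); do
    # Give the last remaining node an even share of what is left;
    # the trace shows 512 landing on each of the two nodes.
    nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
    (( _nr_hugepages -= nodes_test[_no_nodes - 1] ))
    (( _no_nodes-- ))
done
echo "${nodes_test[@]}"   # 512 512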
00:04:13.278 05:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42350404 kB' 'MemAvailable: 43991176 kB' 'Buffers: 6816 kB' 'Cached: 10629228 kB' 'SwapCached: 144 kB' 'Active: 8064004 kB' 'Inactive: 3166072 kB' 'Active(anon): 7156528 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597284 kB' 'Mapped: 148052 kB' 'Shmem: 8887344 kB' 'KReclaimable: 588872 kB' 'Slab: 1591876 kB' 'SReclaimable: 588872 kB' 'SUnreclaim: 1003004 kB' 'KernelStack: 21856 kB' 'PageTables: 8412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11409004 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218040 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ 
Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.278 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.278 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # 
continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.279 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.279 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 
05:29:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.280 05:29:23 -- setup/common.sh@33 -- # echo 0 00:04:13.280 05:29:23 -- setup/common.sh@33 -- # return 0 00:04:13.280 05:29:23 -- setup/hugepages.sh@99 -- # surp=0 00:04:13.280 05:29:23 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:13.280 05:29:23 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:13.280 05:29:23 -- setup/common.sh@18 -- # local node= 00:04:13.280 05:29:23 -- setup/common.sh@19 -- # local var val 00:04:13.280 05:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.280 05:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.280 05:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.280 05:29:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.280 05:29:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.280 05:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42348376 kB' 'MemAvailable: 43989148 kB' 'Buffers: 6816 kB' 'Cached: 10629240 kB' 'SwapCached: 144 kB' 'Active: 8059756 kB' 'Inactive: 3166072 kB' 'Active(anon): 7152280 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592964 kB' 'Mapped: 148052 kB' 'Shmem: 8887356 kB' 'KReclaimable: 588872 kB' 'Slab: 1591876 kB' 'SReclaimable: 588872 kB' 'SUnreclaim: 1003004 kB' 'KernelStack: 21824 kB' 'PageTables: 8304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 
'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11405308 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 
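
The scan running above is setup/common.sh's generic /proc/meminfo lookup: each 'Key: value' line is split with IFS=': ' into var and val, compared against the requested field, and skipped with continue on a mismatch until the matching key echoes its value. A minimal standalone sketch of that pattern follows; get_field is a hypothetical stand-in, not the actual SPDK helper:

#!/usr/bin/env bash
# Sketch of the lookup pattern traced above; get_field is a hypothetical name.
get_field() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Echo the value of the requested key and stop; skip every other line.
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1
}

get_field HugePages_Rsvd    # prints 0 on this box, per the trace above
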
00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.280 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.280 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ KReclaimable == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 
05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.281 05:29:23 -- setup/common.sh@33 -- # echo 0 00:04:13.281 05:29:23 -- setup/common.sh@33 -- # return 0 00:04:13.281 05:29:23 -- setup/hugepages.sh@100 -- # resv=0 00:04:13.281 05:29:23 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:13.281 nr_hugepages=1024 00:04:13.281 05:29:23 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:13.281 resv_hugepages=0 00:04:13.281 05:29:23 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:13.281 surplus_hugepages=0 00:04:13.281 05:29:23 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:13.281 anon_hugepages=0 00:04:13.281 05:29:23 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:13.281 05:29:23 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:13.281 05:29:23 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:13.281 05:29:23 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:13.281 05:29:23 -- setup/common.sh@18 -- # local node= 00:04:13.281 05:29:23 -- setup/common.sh@19 -- # local var val 00:04:13.281 05:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.281 05:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.281 05:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.281 05:29:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.281 05:29:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.281 05:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42342916 kB' 'MemAvailable: 43983688 kB' 'Buffers: 6816 kB' 'Cached: 10629268 kB' 'SwapCached: 144 kB' 'Active: 8063152 kB' 'Inactive: 3166072 kB' 'Active(anon): 7155676 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596804 kB' 'Mapped: 148052 kB' 'Shmem: 8887384 kB' 'KReclaimable: 588872 kB' 'Slab: 1591876 kB' 'SReclaimable: 588872 kB' 'SUnreclaim: 1003004 kB' 'KernelStack: 21840 kB' 'PageTables: 8348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11409032 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218024 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:13.281 05:29:23 -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.281 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.281 05:29:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.282 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.282 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.283 05:29:23 -- setup/common.sh@33 -- # echo 1024 00:04:13.283 05:29:23 -- setup/common.sh@33 -- # return 0 00:04:13.283 05:29:23 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:13.283 05:29:23 -- setup/hugepages.sh@112 -- # get_nodes 00:04:13.283 05:29:23 -- setup/hugepages.sh@27 -- # local node 00:04:13.283 05:29:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:13.283 05:29:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 
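
The get_nodes pass above enumerates NUMA nodes with the extglob pattern /sys/devices/system/node/node+([0-9]) and records an expected 512 pages per node. A sketch of the same enumeration; the awk extraction assumes the 'Node N Key: value' layout of the per-node meminfo files:

shopt -s extglob    # required for the +([0-9]) glob used by get_nodes
declare -A node_free
for node in /sys/devices/system/node/node+([0-9]); do
    id=${node##*node}    # /sys/devices/system/node/node0 -> 0
    # Per-node lines read 'Node 0 HugePages_Free: 512', so the value is field 4.
    node_free[$id]=$(awk '$3 == "HugePages_Free:" {print $4}' "$node/meminfo")
done
for id in "${!node_free[@]}"; do
    echo "node$id HugePages_Free=${node_free[$id]}"
done
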
00:04:13.283 05:29:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:13.283 05:29:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:13.283 05:29:23 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:13.283 05:29:23 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:13.283 05:29:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:13.283 05:29:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:13.283 05:29:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:13.283 05:29:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.283 05:29:23 -- setup/common.sh@18 -- # local node=0 00:04:13.283 05:29:23 -- setup/common.sh@19 -- # local var val 00:04:13.283 05:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.283 05:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.283 05:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:13.283 05:29:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:13.283 05:29:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.283 05:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23172376 kB' 'MemUsed: 9462060 kB' 'SwapCached: 44 kB' 'Active: 5525588 kB' 'Inactive: 535260 kB' 'Active(anon): 4748028 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5804288 kB' 'Mapped: 91640 kB' 'AnonPages: 259732 kB' 'Shmem: 4491480 kB' 'KernelStack: 10056 kB' 'PageTables: 4664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 404960 kB' 'Slab: 889028 kB' 'SReclaimable: 404960 kB' 'SUnreclaim: 484068 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 05:29:23 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:24 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.283 
05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.283 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.283 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.283 05:29:24 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 
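
This node-0 pass picked its input at common.sh@22-24 above: mem_f starts as /proc/meminfo and is swapped for /sys/devices/system/node/node0/meminfo when that file exists, after which common.sh@29 strips the 'Node N ' prefix from every line before the same key scan runs. A sketch of that selection; meminfo_path is a hypothetical name, and it assumes the requested node directory exists:

# Hypothetical helper mirroring the mem_f selection at common.sh@22-24.
meminfo_path() {
    local node=$1 f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        f=/sys/devices/system/node/node$node/meminfo
    fi
    echo "$f"
}

# Per-node files prefix each line with 'Node N ', which the trace strips via
# mem=("${mem[@]#Node +([0-9]) }"); sed does the equivalent here.
sed 's/^Node [0-9]* //' "$(meminfo_path 0)" | head -n 3
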
00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@33 -- # echo 0 00:04:13.284 05:29:24 -- setup/common.sh@33 -- # return 0 00:04:13.284 05:29:24 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:13.284 05:29:24 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:13.284 05:29:24 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:13.284 05:29:24 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:13.284 05:29:24 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.284 05:29:24 -- setup/common.sh@18 -- # local node=1 00:04:13.284 05:29:24 -- setup/common.sh@19 -- # local var val 00:04:13.284 05:29:24 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.284 05:29:24 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.284 05:29:24 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:13.284 05:29:24 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:13.284 05:29:24 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.284 05:29:24 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19170828 kB' 'MemUsed: 8478532 kB' 'SwapCached: 100 kB' 'Active: 2532856 kB' 'Inactive: 2630812 kB' 'Active(anon): 2402940 kB' 'Inactive(anon): 2324792 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4831944 kB' 'Mapped: 55908 kB' 'AnonPages: 331900 kB' 'Shmem: 4395908 kB' 'KernelStack: 11800 kB' 'PageTables: 3732 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 183912 kB' 'Slab: 702848 kB' 'SReclaimable: 183912 kB' 'SUnreclaim: 518936 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
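
This per-node HugePages_Surp scan feeds the accounting check seen earlier in the trace (hugepages.sh@107-110): the kernel-reported HugePages_Total must equal the configured nr_hugepages plus surplus plus reserved, i.e. 1024 == 1024 + 0 + 0 here. A self-contained sketch of that check's shape, with awk standing in for the get_meminfo lookups:

# Shape of the hugepages.sh@110 consistency check; awk replaces get_meminfo.
total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
surp=$(awk  '$1 == "HugePages_Surp:"  {print $2}' /proc/meminfo)
resv=$(awk  '$1 == "HugePages_Rsvd:"  {print $2}' /proc/meminfo)
nr_hugepages=1024    # the count this test configured
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: total=$total surp=$surp resv=$resv"
else
    echo "unexpected hugepage accounting" >&2
fi
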
00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.284 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.284 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 
05:29:24 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # continue 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.285 05:29:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.285 05:29:24 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.285 05:29:24 -- setup/common.sh@33 -- # echo 0 00:04:13.285 
05:29:24 -- setup/common.sh@33 -- # return 0 00:04:13.285 05:29:24 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:13.285 05:29:24 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.285 05:29:24 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.285 05:29:24 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.285 05:29:24 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:13.285 node0=512 expecting 512 00:04:13.285 05:29:24 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.285 05:29:24 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.285 05:29:24 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.285 05:29:24 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:13.285 node1=512 expecting 512 00:04:13.285 05:29:24 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:13.285 00:04:13.285 real 0m3.694s 00:04:13.285 user 0m1.354s 00:04:13.285 sys 0m2.416s 00:04:13.285 05:29:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:13.285 05:29:24 -- common/autotest_common.sh@10 -- # set +x 00:04:13.285 ************************************ 00:04:13.285 END TEST even_2G_alloc 00:04:13.285 ************************************ 00:04:13.285 05:29:24 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:13.285 05:29:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:13.285 05:29:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:13.285 05:29:24 -- common/autotest_common.sh@10 -- # set +x 00:04:13.285 ************************************ 00:04:13.285 START TEST odd_alloc 00:04:13.285 ************************************ 00:04:13.285 05:29:24 -- common/autotest_common.sh@1114 -- # odd_alloc 00:04:13.285 05:29:24 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:13.285 05:29:24 -- setup/hugepages.sh@49 -- # local size=2098176 00:04:13.285 05:29:24 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:13.285 05:29:24 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:13.285 05:29:24 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:13.285 05:29:24 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:13.285 05:29:24 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:13.285 05:29:24 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:13.285 05:29:24 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:13.285 05:29:24 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:13.285 05:29:24 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:13.285 05:29:24 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:13.285 05:29:24 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:13.285 05:29:24 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:13.285 05:29:24 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:13.285 05:29:24 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:13.285 05:29:24 -- setup/hugepages.sh@83 -- # : 513 00:04:13.285 05:29:24 -- setup/hugepages.sh@84 -- # : 1 00:04:13.285 05:29:24 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:13.285 05:29:24 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:13.285 05:29:24 -- setup/hugepages.sh@83 -- # : 0 00:04:13.285 05:29:24 -- setup/hugepages.sh@84 -- # : 0 00:04:13.285 05:29:24 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:13.285 05:29:24 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:13.285 05:29:24 -- setup/hugepages.sh@160 -- # 
00:04:13.285 05:29:24 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:04:13.285 05:29:24 -- setup/hugepages.sh@160 -- # setup output
00:04:13.285 05:29:24 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:13.285 05:29:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:16.577 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:16.577 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:16.577 05:29:27 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:16.577 [... xtrace elided: setup/hugepages.sh@89-94 declares locals (node, sorted_t, sorted_s, surp, resv, anon) ...]
00:04:16.577 05:29:27 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:16.577 05:29:27 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:16.578 [... xtrace elided: setup/common.sh@17-31 takes get=AnonHugePages with no node argument, keeps mem_f=/proc/meminfo, and mapfiles it into mem[] ...]
00:04:16.578 05:29:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42351540 kB' 'MemAvailable: 43992312 kB' 'Buffers: 6816 kB' 'Cached: 10629352 kB' 'SwapCached: 144 kB' 'Active: 8061048 kB' 'Inactive: 3166072 kB' 'Active(anon): 7153572 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593764 kB' 'Mapped: 147628 kB' 'Shmem: 8887468 kB' 'KReclaimable: 588872 kB' 'Slab: 1591724 kB' 'SReclaimable: 588872 kB' 'SUnreclaim: 1002852 kB' 'KernelStack: 22000 kB' 'PageTables: 8732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11408196 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218164 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
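
Every get_meminfo call in this trace expands into one xtrace line per /proc/meminfo key, which is what makes this part of the log so bulky: the function reads the whole file into an array, strips any per-node prefix, then walks the array until the requested key matches and echoes its value. A minimal sketch of that pattern, simplified relative to the real setup/common.sh implementation and assuming extglob:

  #!/usr/bin/env bash
  shopt -s extglob   # needed for the +([0-9]) pattern below

  # Echo the value of one meminfo key; with a node number, read that
  # node's meminfo from sysfs instead of the global /proc/meminfo.
  get_meminfo() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo var val _ line
    local -a mem
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
      mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node "Node N " prefix
    for line in "${mem[@]}"; do
      IFS=': ' read -r var val _ <<< "$line"
      [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
  }

  get_meminfo AnonHugePages      # -> 0 in the snapshot above
  get_meminfo HugePages_Total 0  # per-node query against node0
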
00:04:16.578 [... xtrace elided: setup/common.sh@31-32 scans each key of the snapshot above, continuing past every entry until AnonHugePages is reached ...]
00:04:16.579 05:29:27 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:16.579 05:29:27 -- setup/common.sh@33 -- # echo 0
00:04:16.579 05:29:27 -- setup/common.sh@33 -- # return 0
00:04:16.579 05:29:27 -- setup/hugepages.sh@97 -- # anon=0
00:04:16.579 05:29:27 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:16.579 [... xtrace elided: setup/common.sh@17-31 re-reads /proc/meminfo into mem[]; the snapshot is near-identical to the one above (HugePages_Total: 1025, HugePages_Free: 1025, HugePages_Rsvd: 0, HugePages_Surp: 0; transient fields such as MemFree, AnonPages and PageTables drift slightly), and the per-key scan runs again until HugePages_Surp is reached ...]
00:04:16.581 05:29:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.581 05:29:27 -- setup/common.sh@33 -- # echo 0
00:04:16.581 05:29:27 -- setup/common.sh@33 -- # return 0
00:04:16.581 05:29:27 -- setup/hugepages.sh@99 -- # surp=0
00:04:16.581 05:29:27 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:16.581 [... xtrace elided: setup/common.sh@17-31 reads /proc/meminfo once more; the snapshot again shows HugePages_Total: 1025, HugePages_Free: 1025, HugePages_Rsvd: 0, HugePages_Surp: 0 with only transient fields drifting, and the per-key scan stops at HugePages_Rsvd ...]
00:04:16.583 05:29:27 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:16.583 05:29:27 -- setup/common.sh@33 -- # echo 0
00:04:16.583 05:29:27 -- setup/common.sh@33 -- # return 0
00:04:16.583 05:29:27 -- setup/hugepages.sh@100 -- # resv=0
00:04:16.583 05:29:27 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:16.583 nr_hugepages=1025
00:04:16.583 05:29:27 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:16.583 resv_hugepages=0
00:04:16.583 05:29:27 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:16.583 surplus_hugepages=0
00:04:16.583 05:29:27 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:16.583 anon_hugepages=0
00:04:16.583 05:29:27 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:16.583 05:29:27 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:04:16.583 05:29:27 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
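
The accounting verified here is straightforward: every page the test configured must be visible to the kernel, with no surplus or reserved hugepages outstanding. Restated with the get_meminfo sketch from earlier (values as echoed above; a sketch, not the literal verify_nr_hugepages source):

  nr_hugepages=1025                      # configured by odd_alloc
  anon=$(get_meminfo AnonHugePages)      # 0
  surp=$(get_meminfo HugePages_Surp)     # 0
  resv=$(get_meminfo HugePages_Rsvd)     # 0
  total=$(get_meminfo HugePages_Total)   # 1025
  (( total == nr_hugepages + surp + resv ))   # must hold or the test fails
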
local get=HugePages_Total 00:04:16.583 05:29:27 -- setup/common.sh@18 -- # local node= 00:04:16.583 05:29:27 -- setup/common.sh@19 -- # local var val 00:04:16.583 05:29:27 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.583 05:29:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.583 05:29:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.583 05:29:27 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:16.583 05:29:27 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.583 05:29:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.583 05:29:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42356012 kB' 'MemAvailable: 43996784 kB' 'Buffers: 6816 kB' 'Cached: 10629372 kB' 'SwapCached: 144 kB' 'Active: 8060780 kB' 'Inactive: 3166072 kB' 'Active(anon): 7153304 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593880 kB' 'Mapped: 147552 kB' 'Shmem: 8887488 kB' 'KReclaimable: 588872 kB' 'Slab: 1591364 kB' 'SReclaimable: 588872 kB' 'SUnreclaim: 1002492 kB' 'KernelStack: 22224 kB' 'PageTables: 9132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11408240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218196 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # continue 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # continue 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # continue 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # continue 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # continue 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.583 05:29:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.583 05:29:27 -- setup/common.sh@32 -- # [[ SwapCached == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:16.584 05:29:27 -- setup/common.sh@32 -- # continue
[... xtrace condensed: IFS=': '; read -r var val _; continue repeats for every remaining /proc/meminfo key (Active through Unaccepted); none match HugePages_Total ...]
00:04:16.585 05:29:27 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:16.585 05:29:27 -- setup/common.sh@33 -- # echo 1025
00:04:16.585 05:29:27 -- setup/common.sh@33 -- # return 0
00:04:16.585 05:29:27 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:16.585 05:29:27 -- setup/hugepages.sh@112 -- # get_nodes
00:04:16.585 05:29:27 -- setup/hugepages.sh@27 -- # local node
00:04:16.585 05:29:27 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.585 05:29:27 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:16.585 05:29:27 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.585 05:29:27 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:16.585 05:29:27 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:16.585 05:29:27 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:16.585 05:29:27 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:16.585 05:29:27 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:16.585 05:29:27 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:16.585 05:29:27 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:16.585 05:29:27 -- setup/common.sh@18 -- # local node=0
00:04:16.585 05:29:27 -- setup/common.sh@19 -- # local var val
00:04:16.585 05:29:27 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.585 05:29:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.585 05:29:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:16.585 05:29:27 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:16.585 05:29:27 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.585 05:29:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.585 05:29:27 -- setup/common.sh@31 -- # IFS=': '
00:04:16.585 05:29:27 -- setup/common.sh@31 -- # read -r var val _
00:04:16.585 05:29:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23184680 kB' 'MemUsed: 9449756 kB' 'SwapCached: 44 kB' 'Active: 5526028 kB' 'Inactive: 535260 kB' 'Active(anon): 4748468 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5804320 kB' 'Mapped: 91644 kB' 'AnonPages: 260132 kB' 'Shmem: 4491512 kB' 'KernelStack: 10312 kB' 'PageTables: 5168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 404960 kB' 'Slab: 888516 kB' 'SReclaimable: 404960 kB' 'SUnreclaim: 483556 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
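The get_meminfo trace above switches to /sys/devices/system/node/node0/meminfo when a node ID is given (falling back to /proc/meminfo otherwise), strips the "Node <id>" prefix, and scans line by line for the requested key. A minimal bash sketch of that lookup; the helper name is ours, not the project's setup/common.sh:

    # Hypothetical helper (sketch, not the project's setup/common.sh):
    # fetch one field from /proc/meminfo, or from a node's meminfo file
    # when a node ID is given.
    get_meminfo_sketch() {
        local get=$1 node=$2 mem_f=/proc/meminfo line var val
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        while IFS= read -r line; do
            line=${line#"Node $node "}   # per-node lines carry a "Node <id>" prefix
            var=${line%%:*}
            if [[ $var == "$get" ]]; then
                val=${line#*:}
                val=${val//[!0-9]/}      # keep the numeric part ("kB" implied)
                echo "$val"
                return 0
            fi
        done <"$mem_f"
        return 1
    }
    # Usage: get_meminfo_sketch HugePages_Surp 0   -> prints 0, as in the trace above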
[... xtrace condensed: the same read/continue scan runs over every node0 meminfo key until HugePages_Surp is reached ...]
00:04:16.587 05:29:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.587 05:29:27 -- setup/common.sh@33 -- # echo 0
00:04:16.587 05:29:27 -- setup/common.sh@33 -- # return 0
00:04:16.587 05:29:27 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:16.587 05:29:27 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:16.587 05:29:27 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:16.587 05:29:27 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:16.587 05:29:27 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:16.587 05:29:27 -- setup/common.sh@18 -- # local node=1
00:04:16.587 05:29:27 -- setup/common.sh@19 -- # local var val
00:04:16.587 05:29:27 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.587 05:29:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.587 05:29:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:16.587 05:29:27 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:16.587 05:29:27 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.587 05:29:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.587 05:29:27 -- setup/common.sh@31 -- # IFS=': '
00:04:16.587 05:29:27 -- setup/common.sh@31 -- # read -r var val _
00:04:16.587 05:29:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 19171328 kB' 'MemUsed: 8478032 kB' 'SwapCached: 100 kB' 'Active: 2534860 kB' 'Inactive: 2630812 kB' 'Active(anon): 2404944 kB' 'Inactive(anon): 2324792 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4832012 kB' 'Mapped: 55908 kB' 'AnonPages: 333336 kB' 'Shmem: 4395976 kB' 'KernelStack: 11800 kB' 'PageTables: 3744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 183912 kB' 'Slab: 702848 kB' 'SReclaimable: 183912 kB' 'SUnreclaim: 518936 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[... xtrace condensed: identical key-by-key scan over node1 meminfo until HugePages_Surp is reached ...]
00:04:16.588 05:29:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.588 05:29:27 -- setup/common.sh@33 -- # echo 0
00:04:16.588 05:29:27 -- setup/common.sh@33 -- # return 0
00:04:16.588 05:29:27 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:16.588 05:29:27 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:16.588 05:29:27 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:16.588 05:29:27 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:16.588 05:29:27 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:04:16.588 node0=512 expecting 513
00:04:16.588 05:29:27 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:16.588 05:29:27 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:16.588 05:29:27 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:16.588 05:29:27 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:04:16.588 node1=513 expecting 512
00:04:16.588 05:29:27 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:16.588 real 0m3.531s
00:04:16.588 user 0m1.352s
00:04:16.588 sys 0m2.253s
00:04:16.588 05:29:27 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:16.588 05:29:27 -- common/autotest_common.sh@10 -- # set +x
00:04:16.588 ************************************
00:04:16.588 END TEST odd_alloc
00:04:16.588 ************************************
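The odd_alloc pass just finished distributes an odd total (1025 pages) across the two nodes as 512 and 513, then checks that per-node HugePages_Total values sum back to the global count with no surplus or reserved pages. A hedged sketch of that invariant, reusing the hypothetical get_meminfo_sketch helper from above:

    # Sketch of the invariant odd_alloc asserts: per-node HugePages_Total
    # values must add up to the system-wide count (512 + 513 == 1025 here).
    shopt -s extglob
    check_node_totals() {
        local total=0 node
        for node in /sys/devices/system/node/node+([0-9]); do
            (( total += $(get_meminfo_sketch HugePages_Total "${node##*node}") ))
        done
        (( total == $(get_meminfo_sketch HugePages_Total) ))
    }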
00:04:16.588 05:29:27 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:04:16.588 05:29:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:16.588 05:29:27 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:16.588 05:29:27 -- common/autotest_common.sh@10 -- # set +x
00:04:16.588 ************************************
00:04:16.588 START TEST custom_alloc
00:04:16.588 ************************************
00:04:16.588 05:29:27 -- common/autotest_common.sh@1114 -- # custom_alloc
00:04:16.588 05:29:27 -- setup/hugepages.sh@167 -- # local IFS=,
00:04:16.588 05:29:27 -- setup/hugepages.sh@169 -- # local node
00:04:16.588 05:29:27 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:04:16.588 05:29:27 -- setup/hugepages.sh@170 -- # local nodes_hp
00:04:16.588 05:29:27 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:04:16.588 05:29:27 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:04:16.588 05:29:27 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:16.588 05:29:27 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:16.588 05:29:27 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:16.588 05:29:27 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:16.588 05:29:27 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:16.588 05:29:27 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:16.588 05:29:27 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:16.588 05:29:27 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:16.588 05:29:27 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:16.588 05:29:27 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:16.588 05:29:27 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:16.588 05:29:27 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:16.588 05:29:27 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:16.588 05:29:27 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:16.588 05:29:27 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:16.588 05:29:27 -- setup/hugepages.sh@83 -- # : 256
00:04:16.588 05:29:27 -- setup/hugepages.sh@84 -- # : 1
00:04:16.589 05:29:27 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:16.589 05:29:27 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:16.589 05:29:27 -- setup/hugepages.sh@83 -- # : 0
00:04:16.589 05:29:27 -- setup/hugepages.sh@84 -- # : 0
00:04:16.589 05:29:27 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:16.589 05:29:27 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:04:16.589 05:29:27 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:04:16.589 05:29:27 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:04:16.589 05:29:27 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:16.589 05:29:27 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:16.589 05:29:27 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:16.589 05:29:27 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:16.589 05:29:27 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
[... xtrace condensed: same user_nodes/_no_nodes locals as above, now with _nr_hugepages=1024 ...]
00:04:16.589 05:29:27 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:04:16.589 05:29:27 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:16.589 05:29:27 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:16.589 05:29:27 -- setup/hugepages.sh@78 -- # return 0
00:04:16.589 05:29:27 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:04:16.589 05:29:27 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:16.589 05:29:27 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:16.589 05:29:27 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:16.589 05:29:27 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:16.589 05:29:27 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:16.589 05:29:27 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:16.589 05:29:27 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
[... xtrace condensed: locals once more, then (( 2 > 0 )) over the two nodes_hp entries ...]
00:04:16.589 05:29:27 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:16.589 05:29:27 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:16.589 05:29:27 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:16.589 05:29:27 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:04:16.589 05:29:27 -- setup/hugepages.sh@78 -- # return 0
00:04:16.589 05:29:27 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:04:16.589 05:29:27 -- setup/hugepages.sh@187 -- # setup output
00:04:16.589 05:29:27 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:16.589 05:29:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:19.877 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:19.877 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:19.878 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:19.878 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:19.878 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:19.878 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:19.878 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:19.878 05:29:30 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
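custom_alloc above encodes an asymmetric request as HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' and lets scripts/setup.sh apply it, ending with 1536 pages reserved. As an illustration only (an assumption about the mechanism, not a copy of setup.sh), the same split can be made directly through the kernel's per-node sysfs knob:

    # Hypothetical helper: reserve 2 MiB hugepages node by node via sysfs
    # (illustrative; the test drives scripts/setup.sh through HUGENODE instead).
    # Usage: reserve_per_node 0:512 1:1024   (run as root)
    reserve_per_node() {
        local spec node pages
        for spec in "$@"; do
            node=${spec%%:*} pages=${spec#*:}
            echo "$pages" >"/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages"
        done
        grep -E '^HugePages_(Total|Free)' /proc/meminfo   # expect 1536/1536 afterwards
    }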
00:04:19.878 05:29:30 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:04:19.878 05:29:30 -- setup/hugepages.sh@89 -- # local node
00:04:19.878 05:29:30 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:19.878 05:29:30 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:19.878 05:29:30 -- setup/hugepages.sh@92 -- # local surp
00:04:19.878 05:29:30 -- setup/hugepages.sh@93 -- # local resv
00:04:19.878 05:29:30 -- setup/hugepages.sh@94 -- # local anon
00:04:19.878 05:29:30 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:19.878 05:29:30 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:19.878 05:29:30 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:19.878 05:29:30 -- setup/common.sh@18 -- # local node=
00:04:19.878 05:29:30 -- setup/common.sh@19 -- # local var val
00:04:19.878 05:29:30 -- setup/common.sh@20 -- # local mem_f mem
00:04:19.878 05:29:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:19.878 05:29:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:19.878 05:29:30 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:19.878 05:29:30 -- setup/common.sh@28 -- # mapfile -t mem
00:04:19.878 05:29:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:19.878 05:29:30 -- setup/common.sh@31 -- # IFS=': '
00:04:19.878 05:29:30 -- setup/common.sh@31 -- # read -r var val _
00:04:19.878 05:29:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41296328 kB' 'MemAvailable: 42937052 kB' 'Buffers: 6816 kB' 'Cached: 10629488 kB' 'SwapCached: 144 kB' 'Active: 8060088 kB' 'Inactive: 3166072 kB' 'Active(anon): 7152612 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593048 kB' 'Mapped: 147564 kB' 'Shmem: 8887604 kB' 'KReclaimable: 588824 kB' 'Slab: 1592336 kB' 'SReclaimable: 588824 kB' 'SUnreclaim: 1003512 kB' 'KernelStack: 21856 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11404308 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... xtrace condensed: key-by-key scan over /proc/meminfo until AnonHugePages is reached ...]
00:04:19.879 05:29:30 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:19.879 05:29:30 -- setup/common.sh@33 -- # echo 0
00:04:19.879 05:29:30 -- setup/common.sh@33 -- # return 0
00:04:19.879 05:29:30 -- setup/hugepages.sh@97 -- # anon=0
00:04:19.879 05:29:30 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
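verify_nr_hugepages gates the AnonHugePages sample (0 in this run) on the transparent-hugepage mode string, where the bracketed word is the active mode ('always [madvise] never' here). A small sketch of that gate; the sysfs path is standard kernel ABI, the helper names are ours:

    # Sketch of the THP gate seen at setup/hugepages.sh@96: anon hugepages
    # only need counting when the mode is not "[never]".
    thp_active() {
        [[ $(</sys/kernel/mm/transparent_hugepage/enabled) != *"[never]"* ]]
    }
    anon=0
    if thp_active; then
        anon=$(get_meminfo_sketch AnonHugePages)   # hypothetical helper from earlier
    fi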
00:04:19.879 05:29:30 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:19.879 05:29:30 -- setup/common.sh@18 -- # local node=
00:04:19.879 05:29:30 -- setup/common.sh@19 -- # local var val
00:04:19.879 05:29:30 -- setup/common.sh@20 -- # local mem_f mem
00:04:19.879 05:29:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:19.879 05:29:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:19.879 05:29:30 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:19.879 05:29:30 -- setup/common.sh@28 -- # mapfile -t mem
00:04:19.879 05:29:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:19.879 05:29:30 -- setup/common.sh@31 -- # IFS=': '
00:04:19.879 05:29:30 -- setup/common.sh@31 -- # read -r var val _
00:04:19.879 05:29:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41298480 kB' 'MemAvailable: 42939204 kB' 'Buffers: 6816 kB' 'Cached: 10629488 kB' 'SwapCached: 144 kB' 'Active: 8059776 kB' 'Inactive: 3166072 kB' 'Active(anon): 7152300 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592740 kB' 'Mapped: 147564 kB' 'Shmem: 8887604 kB' 'KReclaimable: 588824 kB' 'Slab: 1592400 kB' 'SReclaimable: 588824 kB' 'SUnreclaim: 1003576 kB' 'KernelStack: 21840 kB' 'PageTables: 8412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11404320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218052 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... xtrace condensed: key-by-key scan over /proc/meminfo toward HugePages_Surp ...]
00:04:19.880 05:29:30 -- setup/common.sh@32 -- # [[ FileHugePages ==
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # continue 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # continue 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # continue 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # continue 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # continue 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # continue 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # continue 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # continue 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.880 05:29:30 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.880 05:29:30 -- setup/common.sh@33 -- # echo 0 00:04:19.880 05:29:30 -- setup/common.sh@33 -- # return 0 00:04:19.880 05:29:30 -- setup/hugepages.sh@99 -- # surp=0 00:04:19.880 05:29:30 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:19.880 05:29:30 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:19.880 05:29:30 -- setup/common.sh@18 -- # local node= 00:04:19.880 05:29:30 -- setup/common.sh@19 -- # local var val 00:04:19.880 05:29:30 -- setup/common.sh@20 -- # local mem_f mem 00:04:19.880 05:29:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:19.880 05:29:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:19.880 05:29:30 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:19.880 05:29:30 -- setup/common.sh@28 -- # mapfile -t mem 00:04:19.880 05:29:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.880 05:29:30 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.880 05:29:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41298844 kB' 'MemAvailable: 42939568 kB' 'Buffers: 6816 kB' 'Cached: 10629500 kB' 'SwapCached: 144 kB' 'Active: 8059780 kB' 'Inactive: 3166072 kB' 
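Every get_meminfo call in this stretch of the log follows the pattern visible in the xtrace above: mapfile the meminfo file, strip any per-node "Node N" prefix, then read field by field until the requested name matches and print its value. A self-contained bash sketch of that pattern follows; the function name and argument handling are illustrative, not a copy of SPDK's setup/common.sh.

shopt -s extglob

# Sketch of the traced get_meminfo pattern (illustrative). Prints the value
# of one meminfo field, system-wide or for a single NUMA node.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local var val _ line
    local mem_f=/proc/meminfo
    local -a mem
    # Per-node stats live in sysfs; fall back to the system-wide file otherwise.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node lines carry a "Node <id> " prefix; strip it (extglob pattern,
    # exactly as the traced mem=("${mem[@]#Node +([0-9]) }") does).
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        # Skip every field until the requested one -- this loop is what
        # produces the long runs of "continue" records in the trace.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

# Usage: get_meminfo_sketch HugePages_Surp      # system-wide, prints 0 here
#        get_meminfo_sketch HugePages_Surp 0    # NUMA node 0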
00:04:19.882 05:29:31 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:19.882 05:29:31 -- setup/common.sh@33 -- # echo 0
00:04:19.882 05:29:31 -- setup/common.sh@33 -- # return 0
00:04:19.882 05:29:31 -- setup/hugepages.sh@100 -- # resv=0
00:04:19.882 05:29:31 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
nr_hugepages=1536
00:04:19.882 05:29:31 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:04:19.882 05:29:31 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:04:19.882 05:29:31 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:04:19.882 05:29:31 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:19.882 05:29:31 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
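The surplus and reserved counts feed the consistency checks at hugepages.sh@107-110: the kernel's HugePages_Total has to equal the requested nr_hugepages plus any surplus and reserved pages. Roughly, reusing the illustrative sketch above (the traced script inlines these steps rather than wrapping them in a function):

# Sketch of the accounting check traced at hugepages.sh@107-110 (illustrative).
verify_hugepage_accounting() {
    local nr_hugepages=$1   # the requested count, 1536 in this run
    local surp resv total
    surp=$(get_meminfo_sketch HugePages_Surp)    # 0 here
    resv=$(get_meminfo_sketch HugePages_Rsvd)    # 0 here
    total=$(get_meminfo_sketch HugePages_Total)  # 1536 here
    echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp"
    # Fails (non-zero status) if the kernel's view diverges from the request.
    (( total == nr_hugepages + surp + resv ))
}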
00:04:19.882 05:29:31 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:19.882 05:29:31 -- setup/common.sh@17 -- # local get=HugePages_Total
[ xtrace elided: same function-entry records (locals, mapfile of /proc/meminfo) as above ]
00:04:19.882 05:29:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41298688 kB' 'MemAvailable: 42939412 kB' 'Buffers: 6816 kB' 'Cached: 10629528 kB' 'SwapCached: 144 kB' 'Active: 8059456 kB' 'Inactive: 3166072 kB' 'Active(anon): 7151980 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592356 kB' 'Mapped: 147564 kB' 'Shmem: 8887644 kB' 'KReclaimable: 588824 kB' 'Slab: 1592400 kB' 'SReclaimable: 588824 kB' 'SUnreclaim: 1003576 kB' 'KernelStack: 21824 kB' 'PageTables: 8360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11404348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[ xtrace elided: ~55-record field scan continuing until HugePages_Total matches ]
00:04:19.883 05:29:31 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:19.883 05:29:31 -- setup/common.sh@33 -- # echo 1536
00:04:19.883 05:29:31 -- setup/common.sh@33 -- # return 0
00:04:19.883 05:29:31 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
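With the system-wide accounting confirmed, the test enumerates NUMA nodes via get_nodes. A sketch of that enumeration is below; the trace only shows the already-expanded assignments (512 and 1024), so the sysfs counter used here for the per-node value is an assumption, though the node glob matches the traced loop:

shopt -s extglob nullglob

# Sketch of the get_nodes enumeration traced at hugepages.sh@27-33
# (illustrative). One array entry per NUMA node, keyed by node id.
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    # Assumed source for the per-node count: the standard kernel sysfs
    # counter for 2 MiB hugepages (the trace does not show this expression).
    nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
done
no_nodes=${#nodes_sys[@]}
(( no_nodes > 0 )) || exit 1   # the test requires at least one node
echo "no_nodes=$no_nodes"      # 2 on this machine: node0=512, node1=1024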
00:04:19.883 05:29:31 -- setup/hugepages.sh@112 -- # get_nodes
00:04:19.883 05:29:31 -- setup/hugepages.sh@27 -- # local node
00:04:19.883 05:29:31 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:19.883 05:29:31 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:19.883 05:29:31 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:19.883 05:29:31 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:19.883 05:29:31 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:19.883 05:29:31 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:19.883 05:29:31 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:19.883 05:29:31 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:19.883 05:29:31 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:19.883 05:29:31 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:19.883 05:29:31 -- setup/common.sh@18 -- # local node=0
00:04:19.883 05:29:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:19.883 05:29:31 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
[ xtrace elided: remaining function-entry records (locals, mapfile), as above ]
00:04:19.883 05:29:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23182496 kB' 'MemUsed: 9451940 kB' 'SwapCached: 44 kB' 'Active: 5525932 kB' 'Inactive: 535260 kB' 'Active(anon): 4748372 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5804356 kB' 'Mapped: 91656 kB' 'AnonPages: 260044 kB' 'Shmem: 4491548 kB' 'KernelStack: 10056 kB' 'PageTables: 4760 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 404920 kB' 'Slab: 889528 kB' 'SReclaimable: 404920 kB' 'SUnreclaim: 484608 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[ xtrace elided: ~35-record field scan over node0's meminfo until HugePages_Surp matches ]
00:04:19.884 05:29:31 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:19.884 05:29:31 -- setup/common.sh@33 -- # echo 0
00:04:19.884 05:29:31 -- setup/common.sh@33 -- # return 0
00:04:19.884 05:29:31 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
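The loop at hugepages.sh@115-117, whose first pass (node 0) just completed above, folds the reserved count and each node's reported surplus into the expected per-node totals. Sketched with the illustrative helper from earlier; in this run resv and both per-node surpluses are 0, so the totals stay at 512 and 1024:

# Sketch of the per-node adjustment loop traced at hugepages.sh@115-117
# (illustrative; nodes_test mirrors the traced array, seeded from get_nodes).
declare -A nodes_test=([0]=512 [1]=1024)
resv=0
for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))                     # hugepages.sh@116
    surp=$(get_meminfo_sketch HugePages_Surp "$node")  # hugepages.sh@117
    (( nodes_test[node] += surp ))
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # 512 and 1024 here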
00:04:19.884 05:29:31 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:19.884 05:29:31 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:19.884 05:29:31 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:19.885 05:29:31 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:19.885 05:29:31 -- setup/common.sh@18 -- # local node=1
00:04:19.885 05:29:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:19.885 05:29:31 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
[ xtrace elided: remaining function-entry records (locals, mapfile), as above ]
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18116396 kB' 'MemUsed: 9532964 kB' 'SwapCached: 100 kB' 'Active: 2533896 kB' 'Inactive: 2630812 kB' 'Active(anon): 2403980 kB' 'Inactive(anon): 2324792 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4832148 kB' 'Mapped: 55908 kB' 'AnonPages: 332700 kB' 'Shmem: 4396112 kB' 'KernelStack: 11784 kB' 'PageTables: 3652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 183904 kB' 'Slab: 702872 kB' 'SReclaimable: 183904 kB' 'SUnreclaim: 518968 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.885 05:29:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.885 05:29:31 -- setup/common.sh@32 -- # continue 00:04:19.885 
00:04:19.885 05:29:31 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:19.885 05:29:31 -- setup/common.sh@33 -- # echo 0
00:04:19.885 05:29:31 -- setup/common.sh@33 -- # return 0
00:04:19.885 05:29:31 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:19.885 05:29:31 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:19.885 05:29:31 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:19.885 05:29:31 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:19.885 05:29:31 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:19.885 node0=512 expecting 512
00:04:19.885 05:29:31 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:19.885 05:29:31 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:19.885 05:29:31 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:19.885 05:29:31 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:19.885 node1=1024 expecting 1024
00:04:19.885 05:29:31 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:19.885
00:04:19.885 real 0m3.444s
00:04:19.885 user 0m1.224s
00:04:19.885 sys 0m2.249s
00:04:19.885 05:29:31 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:19.885 05:29:31 -- common/autotest_common.sh@10 -- # set +x
00:04:19.886 ************************************
00:04:19.886 END TEST custom_alloc
00:04:19.886 ************************************
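That closes out custom_alloc: the observed per-node counts, joined as 512,1024, matched the expected layout exactly, and the backslash-escaped right-hand side at @130 is just how xtrace renders a quoted (literal) comparison. A sketch of that final check, using illustrative variable names rather than the exact ones in setup/hugepages.sh:

    #!/usr/bin/env bash
    # Sketch: verify per-node hugepage counts against an expected layout.
    # Illustrative names; values taken from the trace above.
    nodes_test=(512 1024)          # observed: node0=512, node1=1024
    expected="512,1024"

    for node in "${!nodes_test[@]}"; do
        echo "node$node=${nodes_test[node]} expecting ${nodes_test[node]}"
    done

    joined=$(IFS=,; echo "${nodes_test[*]}")   # "512,1024"
    # Quoting the RHS forces the literal match that xtrace shows escaped.
    [[ $joined == "$expected" ]] || { echo "allocation mismatch" >&2; exit 1; }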
00:04:19.886 05:29:31 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:19.886 05:29:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:19.886 05:29:31 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:19.886 05:29:31 -- common/autotest_common.sh@10 -- # set +x
00:04:19.886 ************************************
00:04:19.886 START TEST no_shrink_alloc
00:04:19.886 ************************************
00:04:19.886 05:29:31 -- common/autotest_common.sh@1114 -- # no_shrink_alloc
00:04:19.886 05:29:31 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:19.886 05:29:31 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:19.886 05:29:31 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:19.886 05:29:31 -- setup/hugepages.sh@51 -- # shift
00:04:19.886 05:29:31 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:19.886 05:29:31 -- setup/hugepages.sh@52 -- # local node_ids
00:04:19.886 05:29:31 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:19.886 05:29:31 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:19.886 05:29:31 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:19.886 05:29:31 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:19.886 05:29:31 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:19.886 05:29:31 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:20.144 05:29:31 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:20.144 05:29:31 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:20.144 05:29:31 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:20.144 05:29:31 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:20.144 05:29:31 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:20.144 05:29:31 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:20.144 05:29:31 -- setup/hugepages.sh@73 -- # return 0
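get_test_nr_hugepages has now converted the requested size 2097152 into nr_hugepages=1024 and, since node 0 was the only node named, assigned the whole pool to it. A sketch of that sizing step, assuming the request is in kB against the 2048 kB Hugepagesize reported in the dumps that follow:

    #!/usr/bin/env bash
    # Sketch: size a hugepage pool and pin it to the requested nodes.
    # Assumes the 2097152 request is in kB and 2048 kB pages (Hugepagesize).
    size=2097152
    default_hugepages=2048
    nr_hugepages=$(( size / default_hugepages ))   # 1024

    user_nodes=(0)                 # only node 0 was named in the trace
    declare -a nodes_test
    for _no_nodes in "${user_nodes[@]}"; do
        nodes_test[_no_nodes]=$nr_hugepages        # node0 carries all 1024
    done

    declare -p nodes_test          # declare -a nodes_test=([0]="1024")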
00:04:20.144 05:29:31 -- setup/hugepages.sh@198 -- # setup output
00:04:20.144 05:29:31 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:20.144 05:29:31 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:23.431 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:23.431 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:23.431 05:29:34 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:23.431 05:29:34 -- setup/hugepages.sh@89 -- # local node
00:04:23.431 05:29:34 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:23.431 05:29:34 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:23.431 05:29:34 -- setup/hugepages.sh@92 -- # local surp
00:04:23.431 05:29:34 -- setup/hugepages.sh@93 -- # local resv
00:04:23.431 05:29:34 -- setup/hugepages.sh@94 -- # local anon
00:04:23.431 05:29:34 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:23.431 05:29:34 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:23.431 05:29:34 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:23.431 05:29:34 -- setup/common.sh@18 -- # local node=
00:04:23.431 05:29:34 -- setup/common.sh@19 -- # local var val
00:04:23.431 05:29:34 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.431 05:29:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.431 05:29:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:23.431 05:29:34 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:23.431 05:29:34 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.431 05:29:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:23.431 05:29:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42333284 kB' 'MemAvailable: 43973976 kB' 'Buffers: 6816 kB' 'Cached: 10629620 kB' 'SwapCached: 144 kB' 'Active: 8062388 kB' 'Inactive: 3166072 kB' 'Active(anon): 7154912 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 594696 kB' 'Mapped: 147680 kB' 'Shmem: 8887736 kB' 'KReclaimable: 588792 kB' 'Slab: 1591736 kB' 'SReclaimable: 588792 kB' 'SUnreclaim: 1002944 kB' 'KernelStack: 21840 kB' 'PageTables: 8440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11404956 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:23.431 05:29:34 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:23.431 05:29:34 -- setup/common.sh@32 -- # continue
00:04:23.432 05:29:34 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:23.432 05:29:34 -- setup/common.sh@33 -- # echo 0
00:04:23.432 05:29:34 -- setup/common.sh@33 -- # return 0
00:04:23.432 05:29:34 -- setup/hugepages.sh@97 -- # anon=0
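anon comes back as 0. verify_nr_hugepages only counts AnonHugePages when transparent hugepages can be active: the @96 gate passed because the kernel reported always [madvise] never, and with the brackets marking the active mode the literal pattern [never] does not appear in that string. A sketch of the same gate, with awk standing in for get_meminfo:

    #!/usr/bin/env bash
    # Sketch: count THP-backed anonymous memory only when THP can be active.
    # Standard sysfs path; awk stands in for get_meminfo here.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"

    anon=0
    if [[ $thp != *"[never]"* ]]; then   # brackets mark the active mode
        anon=$(awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo)
    fi
    echo "anon_hugepages=$anon"          # 0 kB in the run above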
00:04:23.432 05:29:34 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:23.432 05:29:34 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:23.432 05:29:34 -- setup/common.sh@18 -- # local node=
00:04:23.432 05:29:34 -- setup/common.sh@19 -- # local var val
00:04:23.432 05:29:34 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.432 05:29:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.432 05:29:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:23.432 05:29:34 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:23.432 05:29:34 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.432 05:29:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:23.432 05:29:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42333532 kB' 'MemAvailable: 43974224 kB' 'Buffers: 6816 kB' 'Cached: 10629624 kB' 'SwapCached: 144 kB' 'Active: 8061264 kB' 'Inactive: 3166072 kB' 'Active(anon): 7153788 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 594076 kB' 'Mapped: 147568 kB' 'Shmem: 8887740 kB' 'KReclaimable: 588792 kB' 'Slab: 1591700 kB' 'SReclaimable: 588792 kB' 'SUnreclaim: 1002908 kB' 'KernelStack: 21840 kB' 'PageTables: 8416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11404968 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:23.432 05:29:34 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:23.432 05:29:34 -- setup/common.sh@32 -- # continue
00:04:23.434 05:29:34 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:23.434 05:29:34 -- setup/common.sh@33 -- # echo 0
00:04:23.434 05:29:34 -- setup/common.sh@33 -- # return 0
00:04:23.434 05:29:34 -- setup/hugepages.sh@99 -- # surp=0
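Note that these system-wide reads reuse get_meminfo with no node argument: the @23 probe then targets the nonexistent path /sys/devices/system/node/node/meminfo, the -e test fails, and the function falls back to /proc/meminfo. That source selection in isolation (meminfo_path is a hypothetical name):

    #!/usr/bin/env bash
    # Sketch: source selection as traced at common.sh@22-@25.
    meminfo_path() {
        local node=$1 mem_f=/proc/meminfo
        # With node empty this probes .../node/node/meminfo, which never
        # exists, so the system-wide file is kept -- the fallback seen above.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        echo "$mem_f"
    }

    meminfo_path      # -> /proc/meminfo
    meminfo_path 1    # -> node1's meminfo, when that node exists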
00:04:23.434 05:29:34 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:23.434 05:29:34 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:23.434 05:29:34 -- setup/common.sh@18 -- # local node=
00:04:23.434 05:29:34 -- setup/common.sh@19 -- # local var val
00:04:23.434 05:29:34 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.434 05:29:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.434 05:29:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:23.434 05:29:34 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:23.434 05:29:34 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.434 05:29:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:23.434 05:29:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42333532 kB' 'MemAvailable: 43974224 kB' 'Buffers: 6816 kB' 'Cached: 10629628 kB' 'SwapCached: 144 kB' 'Active: 8060956 kB' 'Inactive: 3166072 kB' 'Active(anon): 7153480 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593760 kB' 'Mapped: 147568 kB' 'Shmem: 8887744 kB' 'KReclaimable: 588792 kB' 'Slab: 1591700 kB' 'SReclaimable: 588792 kB' 'SUnreclaim: 1002908 kB' 'KernelStack: 21840 kB' 'PageTables: 8416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11404984 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:23.434 05:29:34 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:23.434 05:29:34 -- setup/common.sh@32 -- # continue
00:04:23.696 05:29:34 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:23.696 05:29:34 -- setup/common.sh@33 -- # echo 0
00:04:23.696 05:29:34 -- setup/common.sh@33 -- # return 0
00:04:23.696 05:29:34 -- setup/hugepages.sh@100 -- # resv=0
00:04:23.696 05:29:34 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:23.696 nr_hugepages=1024
00:04:23.696 05:29:34 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:23.696 resv_hugepages=0
00:04:23.696 05:29:34 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:23.696 surplus_hugepages=0
00:04:23.696 05:29:34 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:23.696 anon_hugepages=0
00:04:23.696 05:29:34 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
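The @107 test is the heart of the verification: the pool the kernel reports (HugePages_Total, 1024 here) must equal the requested count plus the surplus and reserved pages that were just read as 0, and the @109 line that follows re-checks the stricter equality. Restated as a standalone sketch with the traced values:

    #!/usr/bin/env bash
    # Sketch: the accounting identity behind the @107 and @109 checks,
    # with the values just echoed by the trace.
    nr_hugepages=1024   # requested pool size in pages
    surp=0              # HugePages_Surp
    resv=0              # HugePages_Rsvd
    total=1024          # HugePages_Total, read via get_meminfo

    (( total == nr_hugepages + surp + resv )) || { echo FAIL >&2; exit 1; }
    (( total == nr_hugepages ))               || { echo FAIL >&2; exit 1; }
    echo "hugepage pool verified: $total pages"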
00:04:23.696 05:29:34 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:23.696 05:29:34 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:23.696 05:29:34 -- setup/common.sh@18 -- # local node=
00:04:23.696 05:29:34 -- setup/common.sh@19 -- # local var val
00:04:23.696 05:29:34 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.696 05:29:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.696 05:29:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:23.696 05:29:34 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:23.696 05:29:34 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.696 05:29:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:23.696 05:29:34 -- setup/common.sh@31 -- # IFS=': '
00:04:23.696 05:29:34 -- setup/common.sh@31 -- # read -r var val _
00:04:23.696 05:29:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42333384 kB' 'MemAvailable: 43974076 kB' 'Buffers: 6816 kB' 'Cached: 10629648 kB' 'SwapCached: 144 kB' 'Active: 8060892 kB' 'Inactive: 3166072 kB' 'Active(anon): 7153416 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593648 kB' 'Mapped: 147568 kB' 'Shmem: 8887764 kB' 'KReclaimable: 588792 kB' 'Slab: 1591700 kB' 'SReclaimable: 588792 kB' 'SUnreclaim: 1002908 kB' 'KernelStack: 21824 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11404996 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:23.696 05:29:34 -- setup/common.sh@31-32 -- # [repeated read/compare/continue xtrace over each /proc/meminfo key omitted; loop exits at HugePages_Total]
00:04:23.697 05:29:34 -- setup/common.sh@33 -- # echo 1024
00:04:23.697 05:29:34 -- setup/common.sh@33 -- # return 0
00:04:23.697 05:29:34 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:23.697 05:29:34 -- setup/hugepages.sh@112 -- # get_nodes
00:04:23.697 05:29:34 -- setup/hugepages.sh@27 -- # local node
00:04:23.697 05:29:34 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:23.697 05:29:34 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:23.697 05:29:34 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:23.697 05:29:34 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:23.697 05:29:34 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:23.697 05:29:34 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:23.697 05:29:34 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:23.697 05:29:34 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:23.697 05:29:34 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
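The get_meminfo walk traced above (system-wide, and next against node0) reduces to the sketch below. It is reconstructed from the common.sh@17-33 markers; the sed prefix-strip stands in for the extglob expansion mem=("${mem[@]#Node +([0-9]) }") shown in the trace, so treat it as an approximation rather than the verbatim helper.

#!/usr/bin/env bash
# Sketch of get_meminfo as traced (setup/common.sh@17-33, reconstructed).
get_meminfo() {
    local get=$1 node=${2:-} var val _
    local mem_f=/proc/meminfo
    # common.sh@23-24: switch to the per-node view when a node id is given.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Per-node files prefix each line with "Node N "; strip it, then walk
    # the keys exactly as the traced loop does.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # common.sh@32
        echo "$val"                        # common.sh@33
        return 0
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}

get_meminfo HugePages_Total     # -> 1024, the lookup traced above
get_meminfo HugePages_Surp 0    # -> 0, from node0's meminfo (next in the log)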
00:04:23.697 05:29:34 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:23.697 05:29:34 -- setup/common.sh@18 -- # local node=0
00:04:23.697 05:29:34 -- setup/common.sh@19 -- # local var val
00:04:23.697 05:29:34 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.697 05:29:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.697 05:29:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:23.697 05:29:34 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:23.697 05:29:34 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.697 05:29:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:23.697 05:29:34 -- setup/common.sh@31 -- # IFS=': '
00:04:23.697 05:29:34 -- setup/common.sh@31 -- # read -r var val _
00:04:23.697 05:29:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22149632 kB' 'MemUsed: 10484804 kB' 'SwapCached: 44 kB' 'Active: 5526556 kB' 'Inactive: 535260 kB' 'Active(anon): 4748996 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5804404 kB' 'Mapped: 91660 kB' 'AnonPages: 260660 kB' 'Shmem: 4491596 kB' 'KernelStack: 10056 kB' 'PageTables: 4716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 404920 kB' 'Slab: 888988 kB' 'SReclaimable: 404920 kB' 'SUnreclaim: 484068 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:23.698 05:29:34 -- setup/common.sh@31-32 -- # [repeated read/compare/continue xtrace over each node0 meminfo key omitted; loop exits at HugePages_Surp]
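The node0 file read above was selected by get_nodes, traced earlier at hugepages.sh@27-33. A sketch, with one assumption flagged: the trace only shows the already-expanded counts (1024 and 0), so the sysfs hugepage counter used below is an illustrative guess at where they come from, not something this log confirms.

#!/usr/bin/env bash
shopt -s extglob                 # required for the node+([0-9]) glob in the trace
nodes_sys=()
get_nodes() {
    local node
    for node in /sys/devices/system/node/node+([0-9]); do
        # hugepages.sh@30: key by numeric suffix ("node0" -> 0).
        # Assumption: the per-node 2 MB hugepage counter supplies the value.
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}    # 2 on this machine
    (( no_nodes > 0 ))           # hugepages.sh@33: at least one node must exist
}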
00:04:23.698 05:29:34 -- setup/common.sh@33 -- # echo 0
00:04:23.698 05:29:34 -- setup/common.sh@33 -- # return 0
00:04:23.698 05:29:34 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:23.698 05:29:34 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:23.698 05:29:34 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:23.698 05:29:34 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:23.698 05:29:34 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:23.698 node0=1024 expecting 1024
00:04:23.698 05:29:34 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:23.698 05:29:34 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:23.698 05:29:34 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:23.698 05:29:34 -- setup/hugepages.sh@202 -- # setup output
00:04:23.698 05:29:34 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:23.698 05:29:34 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:26.991 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:26.991 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:26.991 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:27.253 05:29:38 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:27.253 05:29:38 -- setup/hugepages.sh@89 -- # local node
00:04:27.253 05:29:38 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:27.253 05:29:38 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:27.253 05:29:38 -- setup/hugepages.sh@92 -- # local surp
00:04:27.253 05:29:38 -- setup/hugepages.sh@93 -- # local resv
00:04:27.253 05:29:38 -- setup/hugepages.sh@94 -- # local anon
00:04:27.253 05:29:38 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
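Before re-verifying, hugepages.sh@96 gates on transparent hugepages: the bracketed token in the kernel's THP mode string decides whether anonymous huge pages need to be sampled at all. A sketch of that check, reconstructed from the traced test; the sysfs path is the standard location of the "always [madvise] never" string seen above.

#!/usr/bin/env bash
# The kernel brackets the active THP mode, e.g. "always [madvise] never".
thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)

if [[ $thp != *"[never]"* ]]; then
    # THP can hand out anonymous huge pages, so sample them; the script
    # itself calls get_meminfo AnonHugePages (traced next, 0 kB here).
    anon=$(awk '$1 == "AnonHugePages:" { print $2 }' /proc/meminfo)
else
    anon=0
fi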
00:04:27.253 05:29:38 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:27.253 05:29:38 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:27.253 05:29:38 -- setup/common.sh@18 -- # local node=
00:04:27.253 05:29:38 -- setup/common.sh@19 -- # local var val
00:04:27.253 05:29:38 -- setup/common.sh@20 -- # local mem_f mem
00:04:27.253 05:29:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:27.253 05:29:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:27.253 05:29:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:27.253 05:29:38 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.253 05:29:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.253 05:29:38 -- setup/common.sh@31 -- # IFS=': '
00:04:27.253 05:29:38 -- setup/common.sh@31 -- # read -r var val _
00:04:27.253 05:29:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42332052 kB' 'MemAvailable: 43972744 kB' 'Buffers: 6816 kB' 'Cached: 10629740 kB' 'SwapCached: 144 kB' 'Active: 8062920 kB' 'Inactive: 3166072 kB' 'Active(anon): 7155444 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595500 kB' 'Mapped: 147616 kB' 'Shmem: 8887856 kB' 'KReclaimable: 588792 kB' 'Slab: 1591616 kB' 'SReclaimable: 588792 kB' 'SUnreclaim: 1002824 kB' 'KernelStack: 22080 kB' 'PageTables: 8960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11408632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218292 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:27.254 05:29:38 -- setup/common.sh@31-32 -- # [repeated read/compare/continue xtrace over each /proc/meminfo key omitted; loop exits at AnonHugePages]
00:04:27.254 05:29:38 -- setup/common.sh@33 -- # echo 0
00:04:27.254 05:29:38 -- setup/common.sh@33 -- # return 0
00:04:27.254 05:29:38 -- setup/hugepages.sh@97 -- # anon=0
00:04:27.254 05:29:38 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
IFS=': ' 00:04:27.254 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42334856 kB' 'MemAvailable: 43975548 kB' 'Buffers: 6816 kB' 'Cached: 10629744 kB' 'SwapCached: 144 kB' 'Active: 8063444 kB' 'Inactive: 3166072 kB' 'Active(anon): 7155968 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596080 kB' 'Mapped: 147564 kB' 'Shmem: 8887860 kB' 'KReclaimable: 588792 kB' 'Slab: 1591512 kB' 'SReclaimable: 588792 kB' 'SUnreclaim: 1002720 kB' 'KernelStack: 22176 kB' 'PageTables: 9096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11410160 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218244 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 05:29:38 -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:27.255 05:29:38 -- setup/common.sh@32 -- # continue
00:04:27.255 05:29:38 -- setup/common.sh@31 -- # IFS=': '
00:04:27.255 05:29:38 -- setup/common.sh@31 -- # read -r var val _
[the same test/continue/re-read triple repeats for every remaining /proc/meminfo key (Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd) until the requested key matches]
00:04:27.256 05:29:38 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:27.256 05:29:38 -- setup/common.sh@33 -- # echo 0
00:04:27.256 05:29:38 -- setup/common.sh@33 -- # return 0
00:04:27.256 05:29:38 -- setup/hugepages.sh@99 -- # surp=0
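The collapsed span above is setup/common.sh's get_meminfo scanning /proc/meminfo one 'key: value' line at a time under IFS=': '. A minimal standalone sketch of the same pattern; the function name and the not-found handling are illustrative, not the script's exact code:

#!/usr/bin/env bash
# Sketch of the get_meminfo scan traced above: split each /proc/meminfo
# line on ': ', skip keys that do not match, print the value of the one
# that does (values are in kB except the bare HugePages_* counts).
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < /proc/meminfo
    return 1 # key not present
}

get_meminfo_sketch HugePages_Surp   # prints 0 on this machine, per the log

The real script additionally snapshots the file with mapfile and strips a 'Node <n> ' prefix so the same loop also works on per-node meminfo files.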
00:04:27.256 05:29:38 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:27.256 05:29:38 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:27.256 05:29:38 -- setup/common.sh@18 -- # local node=
00:04:27.256 05:29:38 -- setup/common.sh@19 -- # local var val
00:04:27.256 05:29:38 -- setup/common.sh@20 -- # local mem_f mem
00:04:27.256 05:29:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:27.256 05:29:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:27.256 05:29:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:27.256 05:29:38 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.256 05:29:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.256 05:29:38 -- setup/common.sh@31 -- # IFS=': '
00:04:27.256 05:29:38 -- setup/common.sh@31 -- # read -r var val _
00:04:27.256 05:29:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42333720 kB' 'MemAvailable: 43974412 kB' 'Buffers: 6816 kB' 'Cached: 10629756 kB' 'SwapCached: 144 kB' 'Active: 8062964 kB' 'Inactive: 3166072 kB' 'Active(anon): 7155488 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595576 kB' 'Mapped: 147564 kB' 'Shmem: 8887872 kB' 'KReclaimable: 588792 kB' 'Slab: 1591576 kB' 'SReclaimable: 588792 kB' 'SUnreclaim: 1002784 kB' 'KernelStack: 22048 kB' 'PageTables: 8964 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11406776 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[the key-by-key scan repeats as above until HugePages_Rsvd matches]
00:04:27.257 05:29:38 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:27.257 05:29:38 -- setup/common.sh@33 -- # echo 0
00:04:27.257 05:29:38 -- setup/common.sh@33 -- # return 0
00:04:27.257 05:29:38 -- setup/hugepages.sh@100 -- # resv=0
00:04:27.257 05:29:38 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:27.257 nr_hugepages=1024
00:04:27.257 05:29:38 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:27.257 resv_hugepages=0
00:04:27.257 05:29:38 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:27.257 surplus_hugepages=0
00:04:27.257 05:29:38 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:27.257 anon_hugepages=0
00:04:27.257 05:29:38 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:27.257 05:29:38 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
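The two arithmetic checks above assert the kernel's hugepage bookkeeping: the reported total must equal the count the test configured plus any surplus and reserved pages. A standalone sketch of that invariant; meminfo_val is an illustrative helper, not the script's own:

#!/usr/bin/env bash
# Sketch of the invariant checked at setup/hugepages.sh@107:
# HugePages_Total == configured count + surplus + reserved.
meminfo_val() { awk -v k="$1:" '$1 == k {print $2}' /proc/meminfo; }

nr_hugepages=${1:-1024}   # the count the test configured
total=$(meminfo_val HugePages_Total)
surp=$(meminfo_val HugePages_Surp)
resv=$(meminfo_val HugePages_Rsvd)

if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: total=$total surp=$surp resv=$resv"
else
    echo "hugepage accounting mismatch: total=$total expected=$((nr_hugepages + surp + resv))" >&2
    exit 1
fi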
00:04:27.257 05:29:38 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:27.257 05:29:38 -- setup/common.sh@17 -- # local get=HugePages_Total
[same preamble as above: node unset, mem_f=/proc/meminfo, mapfile snapshot, Node-prefix strip]
00:04:27.258 05:29:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42333216 kB' 'MemAvailable: 43973908 kB' 'Buffers: 6816 kB' 'Cached: 10629768 kB' 'SwapCached: 144 kB' 'Active: 8062484 kB' 'Inactive: 3166072 kB' 'Active(anon): 7155008 kB' 'Inactive(anon): 2324848 kB' 'Active(file): 907476 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595112 kB' 'Mapped: 147564 kB' 'Shmem: 8887884 kB' 'KReclaimable: 588792 kB' 'Slab: 1591604 kB' 'SReclaimable: 588792 kB' 'SUnreclaim: 1002812 kB' 'KernelStack: 21984 kB' 'PageTables: 8816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11405640 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[the key-by-key scan repeats until HugePages_Total matches]
00:04:27.259 05:29:38 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:27.259 05:29:38 -- setup/common.sh@33 -- # echo 1024
00:04:27.259 05:29:38 -- setup/common.sh@33 -- # return 0
00:04:27.259 05:29:38 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
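With the system-wide totals verified, the get_nodes walk just below distributes the expected pages across NUMA nodes and re-reads each node's own meminfo, whose lines carry a 'Node <n> ' prefix that the script strips. A hedged sketch of that per-node read; the variable layout is illustrative:

#!/usr/bin/env bash
# Sketch of the per-node walk that follows: record HugePages_Total for
# each NUMA node from its sysfs meminfo. Node files prefix every line
# with "Node <n> ", so two extra fields are skipped before the key.
shopt -s extglob nullglob
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    n=${node##*node}
    while IFS=': ' read -r _ _ key val _; do
        # lines read "Node 0 HugePages_Total: 1024"
        [[ $key == HugePages_Total ]] && nodes_sys[$n]=$val
    done < "$node/meminfo"
    echo "node$n HugePages_Total=${nodes_sys[$n]:-0}"
done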
00:04:27.259 05:29:38 -- setup/hugepages.sh@112 -- # get_nodes
00:04:27.259 05:29:38 -- setup/hugepages.sh@27 -- # local node
00:04:27.259 05:29:38 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:27.259 05:29:38 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:27.259 05:29:38 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:27.259 05:29:38 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:27.259 05:29:38 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:27.259 05:29:38 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:27.259 05:29:38 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:27.259 05:29:38 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:27.259 05:29:38 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:27.259 05:29:38 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:27.259 05:29:38 -- setup/common.sh@18 -- # local node=0
00:04:27.259 05:29:38 -- setup/common.sh@19 -- # local var val
00:04:27.259 05:29:38 -- setup/common.sh@20 -- # local mem_f mem
00:04:27.259 05:29:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:27.259 05:29:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:27.259 05:29:38 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:27.259 05:29:38 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.259 05:29:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.259 05:29:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22138444 kB' 'MemUsed: 10495992 kB' 'SwapCached: 44 kB' 'Active: 5527688 kB' 'Inactive: 535260 kB' 'Active(anon): 4750128 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5804456 kB' 'Mapped: 91656 kB' 'AnonPages: 261724 kB' 'Shmem: 4491648 kB' 'KernelStack: 10184 kB' 'PageTables: 5112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 404920 kB' 'Slab: 888988 kB' 'SReclaimable: 404920 kB' 'SUnreclaim: 484068 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:27.259 05:29:38 -- setup/common.sh@31 -- # IFS=': '
00:04:27.259 05:29:38 -- setup/common.sh@31 -- # read -r var val _
[the key-by-key scan repeats over the node0 fields until HugePages_Surp matches]
00:04:27.260 05:29:38 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:27.260 05:29:38 -- setup/common.sh@33 -- # echo 0
00:04:27.260 05:29:38 -- setup/common.sh@33 -- # return 0
00:04:27.260 05:29:38 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:27.260 05:29:38 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:27.260 05:29:38 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:27.260 05:29:38 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:27.260 05:29:38 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:27.260 node0=1024 expecting 1024
00:04:27.260 05:29:38 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:27.260
00:04:27.260 real 0m7.317s
00:04:27.260 user 0m2.680s
00:04:27.260 sys 0m4.760s
00:04:27.260 05:29:38 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:27.260 05:29:38 -- common/autotest_common.sh@10 -- # set +x
00:04:27.260 ************************************
00:04:27.260 END TEST no_shrink_alloc
00:04:27.260 ************************************
00:04:27.260 05:29:38 -- setup/hugepages.sh@217 -- # clear_hp
00:04:27.260 05:29:38 -- setup/hugepages.sh@37 -- # local node hp
00:04:27.260 05:29:38 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:27.260 05:29:38 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:27.260 05:29:38 -- setup/hugepages.sh@41 -- # echo 0
00:04:27.260 05:29:38 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:27.260 05:29:38 -- setup/hugepages.sh@41 -- # echo 0
00:04:27.260 05:29:38 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:27.260 05:29:38 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:27.260 05:29:38 -- setup/hugepages.sh@41 -- # echo 0
00:04:27.260 05:29:38 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:27.260 05:29:38 -- setup/hugepages.sh@41 -- # echo 0
00:04:27.260 05:29:38 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:27.260 05:29:38 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:27.260
00:04:27.260 real 0m27.408s
00:04:27.260 user 0m9.606s
00:04:27.260 sys 0m16.799s
00:04:27.260 05:29:38 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:27.260 05:29:38 -- common/autotest_common.sh@10 -- # set +x
00:04:27.260 ************************************
00:04:27.260 END TEST hugepages
00:04:27.260 ************************************
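The clear_hp teardown above zeroes every per-node hugepage knob so the next suite starts from a clean slate. A minimal equivalent, assuming only the standard sysfs layout (must run as root):

#!/usr/bin/env bash
# Sketch of the clear_hp pattern traced above: for every NUMA node,
# write 0 to each hugepage-size knob, then flag the environment so
# later scripts know the pool was cleared.
for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
        echo 0 > "$hp/nr_hugepages"
    done
done
export CLEAR_HUGE=yes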
00:04:27.519 05:29:38 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:27.519 05:29:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:27.519 05:29:38 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:27.519 05:29:38 -- common/autotest_common.sh@10 -- # set +x
00:04:27.519 ************************************
00:04:27.519 START TEST driver
00:04:27.519 ************************************
00:04:27.519 05:29:38 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:27.519 * Looking for test storage...
00:04:27.519 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:27.519 05:29:38 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:04:27.519 05:29:38 -- common/autotest_common.sh@1690 -- # lcov --version
00:04:27.519 05:29:38 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:04:27.519 05:29:38 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:04:27.519 05:29:38 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:04:27.519 05:29:38 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:04:27.519 05:29:38 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:04:27.519 05:29:38 -- scripts/common.sh@335 -- # IFS=.-:
00:04:27.519 05:29:38 -- scripts/common.sh@335 -- # read -ra ver1
00:04:27.519 05:29:38 -- scripts/common.sh@336 -- # IFS=.-:
00:04:27.519 05:29:38 -- scripts/common.sh@336 -- # read -ra ver2
00:04:27.519 05:29:38 -- scripts/common.sh@337 -- # local 'op=<'
00:04:27.519 05:29:38 -- scripts/common.sh@339 -- # ver1_l=2
00:04:27.519 05:29:38 -- scripts/common.sh@340 -- # ver2_l=1
00:04:27.519 05:29:38 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:04:27.519 05:29:38 -- scripts/common.sh@343 -- # case "$op" in
00:04:27.519 05:29:38 -- scripts/common.sh@344 -- # : 1
00:04:27.519 05:29:38 -- scripts/common.sh@363 -- # (( v = 0 ))
00:04:27.519 05:29:38 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:04:27.519 05:29:38 -- scripts/common.sh@364 -- # decimal 1
00:04:27.519 05:29:38 -- scripts/common.sh@352 -- # local d=1
00:04:27.519 05:29:38 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:04:27.519 05:29:38 -- scripts/common.sh@354 -- # echo 1
00:04:27.519 05:29:38 -- scripts/common.sh@364 -- # ver1[v]=1
00:04:27.519 05:29:38 -- scripts/common.sh@365 -- # decimal 2
00:04:27.519 05:29:38 -- scripts/common.sh@352 -- # local d=2
00:04:27.519 05:29:38 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:04:27.519 05:29:38 -- scripts/common.sh@354 -- # echo 2
00:04:27.519 05:29:38 -- scripts/common.sh@365 -- # ver2[v]=2
00:04:27.519 05:29:38 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:04:27.519 05:29:38 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:04:27.519 05:29:38 -- scripts/common.sh@367 -- # return 0
00:04:27.519 05:29:38 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:04:27.519 05:29:38 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:04:27.519 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:27.519 --rc genhtml_branch_coverage=1
00:04:27.519 --rc genhtml_function_coverage=1
00:04:27.519 --rc genhtml_legend=1
00:04:27.519 --rc geninfo_all_blocks=1
00:04:27.519 --rc geninfo_unexecuted_blocks=1
00:04:27.519 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:27.519 '
[the same option block is then assigned to LCOV_OPTS and again to LCOV, with and without export]
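The cmp_versions walk above decides whether the installed lcov (1.15) predates 2.x, which selects the option spelling used in the exports that follow. A simplified sketch of the same field-by-field comparison; the real script also splits on '-' and ':' to handle pre-release tags:

#!/usr/bin/env bash
# Sketch of the "lt 1.15 2" comparison traced above: split both versions
# on dots, then compare numeric fields left to right until one side wins.
version_lt() {
    local -a v1 v2
    IFS=. read -ra v1 <<< "$1"
    IFS=. read -ra v2 <<< "$2"
    local i max=$(( ${#v1[@]} > ${#v2[@]} ? ${#v1[@]} : ${#v2[@]} ))
    for ((i = 0; i < max; i++)); do
        (( ${v1[i]:-0} < ${v2[i]:-0} )) && return 0
        (( ${v1[i]:-0} > ${v2[i]:-0} )) && return 1
    done
    return 1 # equal is not less-than
}

version_lt 1.15 2 && echo "lcov < 2: use the legacy --rc lcov_* option names"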
00:04:27.520 05:29:38 -- setup/driver.sh@68 -- # setup reset
00:04:27.520 05:29:38 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:27.520 05:29:38 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:32.792 05:29:43 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:04:32.792 05:29:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:32.792 05:29:43 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:32.792 05:29:43 -- common/autotest_common.sh@10 -- # set +x
00:04:32.792 ************************************
00:04:32.792 START TEST guess_driver
00:04:32.792 ************************************
00:04:32.792 05:29:43 -- common/autotest_common.sh@1114 -- # guess_driver
00:04:32.792 05:29:43 -- setup/driver.sh@46 -- # local driver setup_driver marker
00:04:32.792 05:29:43 -- setup/driver.sh@47 -- # local fail=0
00:04:32.792 05:29:43 -- setup/driver.sh@49 -- # pick_driver
00:04:32.792 05:29:43 -- setup/driver.sh@36 -- # vfio
00:04:32.792 05:29:43 -- setup/driver.sh@21 -- # local iommu_groups
00:04:32.792 05:29:43 -- setup/driver.sh@22 -- # local unsafe_vfio
00:04:32.792 05:29:43 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:04:32.792 05:29:43 -- setup/driver.sh@25 -- # unsafe_vfio=N
00:04:32.792 05:29:43 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:04:32.792 05:29:43 -- setup/driver.sh@29 -- # (( 176 > 0 ))
00:04:32.792 05:29:43 -- setup/driver.sh@30 -- # is_driver vfio_pci
00:04:32.792 05:29:43 -- setup/driver.sh@14 -- # mod vfio_pci
00:04:32.792 05:29:43 -- setup/driver.sh@12 -- # dep vfio_pci
00:04:32.792 05:29:43 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:04:32.792 05:29:43 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz
00:04:32.792 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:32.792 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:32.792 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:32.792 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:32.792 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz
00:04:32.792 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz
00:04:32.792 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:04:32.792 05:29:43 -- setup/driver.sh@30 -- # return 0
00:04:32.792 05:29:43 -- setup/driver.sh@37 -- # echo vfio-pci
00:04:32.792 05:29:43 -- setup/driver.sh@49 -- # driver=vfio-pci
00:04:32.792 05:29:43 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:04:32.792 05:29:43 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:04:32.792 Looking for driver=vfio-pci
00:04:32.792 05:29:43 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:04:32.792 05:29:43 -- setup/driver.sh@45 -- # setup output config
00:04:32.792 05:29:43 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:32.792 05:29:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]]
00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
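pick_driver above settled on vfio-pci because IOMMU groups exist (176 of them) and modprobe resolves vfio_pci to real .ko modules. A condensed sketch of that decision; the uio_pci_generic fallback is an assumption about the script's alternate branch, which this log never exercises:

#!/usr/bin/env bash
# Sketch of the driver pick traced above: prefer vfio-pci when the kernel
# exposes at least one IOMMU group and modprobe can resolve the vfio_pci
# dependency chain to actual kernel modules.
shopt -s nullglob
pick_driver_sketch() {
    local groups=(/sys/kernel/iommu_groups/*)
    if ((${#groups[@]} > 0)) && modprobe --show-depends vfio_pci | grep -q '\.ko'; then
        echo vfio-pci
    else
        echo uio_pci_generic   # assumed fallback, not shown in this log
    fi
}
pick_driver_sketch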
setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.076 05:29:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.076 05:29:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.076 05:29:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:37.452 05:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:37.452 05:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:37.452 05:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:37.452 05:29:48 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:37.452 05:29:48 -- setup/driver.sh@65 -- # setup reset 00:04:37.452 05:29:48 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:37.453 05:29:48 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:42.732 00:04:42.732 real 0m9.686s 00:04:42.732 user 0m2.478s 00:04:42.732 sys 0m4.953s 00:04:42.732 05:29:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:42.732 05:29:53 -- common/autotest_common.sh@10 
-- # set +x 00:04:42.732 ************************************ 00:04:42.732 END TEST guess_driver 00:04:42.732 ************************************ 00:04:42.732 00:04:42.732 real 0m14.775s 00:04:42.732 user 0m3.957s 00:04:42.733 sys 0m7.816s 00:04:42.733 05:29:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:42.733 05:29:53 -- common/autotest_common.sh@10 -- # set +x 00:04:42.733 ************************************ 00:04:42.733 END TEST driver 00:04:42.733 ************************************ 00:04:42.733 05:29:53 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:42.733 05:29:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:42.733 05:29:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:42.733 05:29:53 -- common/autotest_common.sh@10 -- # set +x 00:04:42.733 ************************************ 00:04:42.733 START TEST devices 00:04:42.733 ************************************ 00:04:42.733 05:29:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:42.733 * Looking for test storage... 00:04:42.733 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:42.733 05:29:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:42.733 05:29:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:42.733 05:29:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:42.733 05:29:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:42.733 05:29:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:42.733 05:29:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:42.733 05:29:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:42.733 05:29:53 -- scripts/common.sh@335 -- # IFS=.-: 00:04:42.733 05:29:53 -- scripts/common.sh@335 -- # read -ra ver1 00:04:42.733 05:29:53 -- scripts/common.sh@336 -- # IFS=.-: 00:04:42.733 05:29:53 -- scripts/common.sh@336 -- # read -ra ver2 00:04:42.733 05:29:53 -- scripts/common.sh@337 -- # local 'op=<' 00:04:42.733 05:29:53 -- scripts/common.sh@339 -- # ver1_l=2 00:04:42.733 05:29:53 -- scripts/common.sh@340 -- # ver2_l=1 00:04:42.733 05:29:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:42.733 05:29:53 -- scripts/common.sh@343 -- # case "$op" in 00:04:42.733 05:29:53 -- scripts/common.sh@344 -- # : 1 00:04:42.733 05:29:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:42.733 05:29:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:42.733 05:29:53 -- scripts/common.sh@364 -- # decimal 1 00:04:42.733 05:29:53 -- scripts/common.sh@352 -- # local d=1 00:04:42.733 05:29:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:42.733 05:29:53 -- scripts/common.sh@354 -- # echo 1 00:04:42.733 05:29:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:42.733 05:29:53 -- scripts/common.sh@365 -- # decimal 2 00:04:42.733 05:29:53 -- scripts/common.sh@352 -- # local d=2 00:04:42.733 05:29:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:42.733 05:29:53 -- scripts/common.sh@354 -- # echo 2 00:04:42.733 05:29:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:42.733 05:29:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:42.733 05:29:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:42.733 05:29:53 -- scripts/common.sh@367 -- # return 0 00:04:42.733 05:29:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:42.733 05:29:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:42.733 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.733 --rc genhtml_branch_coverage=1 00:04:42.733 --rc genhtml_function_coverage=1 00:04:42.733 --rc genhtml_legend=1 00:04:42.733 --rc geninfo_all_blocks=1 00:04:42.733 --rc geninfo_unexecuted_blocks=1 00:04:42.733 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.733 ' 00:04:42.733 05:29:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:42.733 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.733 --rc genhtml_branch_coverage=1 00:04:42.733 --rc genhtml_function_coverage=1 00:04:42.733 --rc genhtml_legend=1 00:04:42.733 --rc geninfo_all_blocks=1 00:04:42.733 --rc geninfo_unexecuted_blocks=1 00:04:42.733 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.733 ' 00:04:42.733 05:29:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:42.733 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.733 --rc genhtml_branch_coverage=1 00:04:42.733 --rc genhtml_function_coverage=1 00:04:42.733 --rc genhtml_legend=1 00:04:42.733 --rc geninfo_all_blocks=1 00:04:42.733 --rc geninfo_unexecuted_blocks=1 00:04:42.733 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.733 ' 00:04:42.733 05:29:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:42.733 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.733 --rc genhtml_branch_coverage=1 00:04:42.733 --rc genhtml_function_coverage=1 00:04:42.733 --rc genhtml_legend=1 00:04:42.733 --rc geninfo_all_blocks=1 00:04:42.733 --rc geninfo_unexecuted_blocks=1 00:04:42.733 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.733 ' 00:04:42.733 05:29:53 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:42.733 05:29:53 -- setup/devices.sh@192 -- # setup reset 00:04:42.733 05:29:53 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:42.733 05:29:53 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:46.027 05:29:56 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:46.027 05:29:56 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:04:46.027 05:29:56 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:04:46.027 05:29:56 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:04:46.027 05:29:56 
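The scripts/common.sh trace that recurs above (@332 through @367) is a component-wise version comparison: both version strings are split on `.`, `-`, and `:` via IFS, missing fields default to zero, and the fields are compared numerically left to right. It is used here to decide that the installed lcov (1.15) predates 2.x, which gates the `--rc lcov_branch_coverage` / `--rc lcov_function_coverage` flags exported just after. A compact sketch of the same comparison, assuming numeric fields only; `version_lt` is an illustrative name:

```bash
# Hedged sketch of the component-wise version compare traced above:
# split both versions on . - : and compare numerically field by field.
version_lt() {
    local IFS=.-:
    local -a ver1 ver2
    read -ra ver1 <<< "$1"
    read -ra ver2 <<< "$2"
    local len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    local v
    for (( v = 0; v < len; v++ )); do
        # Missing fields compare as 0 (e.g. "2" vs "1.15").
        local a=${ver1[v]:-0} b=${ver2[v]:-0}
        (( a < b )) && return 0
        (( a > b )) && return 1
    done
    return 1   # equal is not less-than
}
version_lt 1.15 2 && echo "lcov < 2: enabling branch/function coverage flags"
```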
-- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:04:46.027 05:29:56 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:04:46.027 05:29:56 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:04:46.027 05:29:56 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:46.027 05:29:56 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:04:46.027 05:29:56 -- setup/devices.sh@196 -- # blocks=() 00:04:46.027 05:29:56 -- setup/devices.sh@196 -- # declare -a blocks 00:04:46.027 05:29:56 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:46.027 05:29:56 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:46.027 05:29:56 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:46.027 05:29:56 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:46.027 05:29:56 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:46.027 05:29:56 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:46.027 05:29:56 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:46.027 05:29:56 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:46.027 05:29:56 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:46.027 05:29:56 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:46.027 05:29:56 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:46.027 No valid GPT data, bailing 00:04:46.027 05:29:56 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:46.027 05:29:56 -- scripts/common.sh@393 -- # pt= 00:04:46.027 05:29:56 -- scripts/common.sh@394 -- # return 1 00:04:46.027 05:29:56 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:46.027 05:29:56 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:46.027 05:29:56 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:46.027 05:29:56 -- setup/common.sh@80 -- # echo 1600321314816 00:04:46.027 05:29:56 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:46.027 05:29:56 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:46.027 05:29:56 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:46.027 05:29:56 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:46.027 05:29:56 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:46.027 05:29:56 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:46.027 05:29:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:46.027 05:29:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:46.027 05:29:56 -- common/autotest_common.sh@10 -- # set +x 00:04:46.027 ************************************ 00:04:46.027 START TEST nvme_mount 00:04:46.027 ************************************ 00:04:46.027 05:29:56 -- common/autotest_common.sh@1114 -- # nvme_mount 00:04:46.027 05:29:56 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:46.027 05:29:56 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:46.027 05:29:56 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.027 05:29:56 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.027 05:29:56 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:46.027 05:29:56 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:46.027 05:29:56 -- setup/common.sh@40 -- # local part_no=1 00:04:46.027 05:29:56 -- setup/common.sh@41 -- # 
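Before touching the drive, the trace shows devices.sh gating candidates three ways: zoned namespaces are filtered out (`/sys/block/<dev>/queue/zoned` must read `none`), a stale partition table is probed for (spdk-gpt.py reports "No valid GPT data, bailing" on a clean disk), and the device must clear the 3 GiB floor (`min_disk_size=3221225472`; the 1.6 TB drive here passes with 1600321314816 bytes). A sketch of that gating using the same sysfs attributes; the echo is illustrative:

```bash
# Hedged sketch: accept only non-zoned block devices above a size floor,
# using the sysfs attributes visible in the trace above.
min_disk_size=$((3 * 1024 * 1024 * 1024))   # 3221225472 bytes, as in the log
for dev in /sys/block/nvme*n*; do
    [[ -e $dev/size ]] || continue
    name=${dev##*/}
    [[ $name == *c* ]] && continue          # skip multipath channel nodes
    # Zoned namespaces report host-managed/host-aware; plain ones say "none".
    [[ -e $dev/queue/zoned && $(< "$dev/queue/zoned") != none ]] && continue
    # "size" is in 512 B sectors regardless of the logical block size.
    bytes=$(( $(< "$dev/size") * 512 ))
    (( bytes >= min_disk_size )) && echo "candidate: $name ($bytes bytes)"
done
```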
local size=1073741824 00:04:46.027 05:29:56 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:46.027 05:29:56 -- setup/common.sh@44 -- # parts=() 00:04:46.027 05:29:56 -- setup/common.sh@44 -- # local parts 00:04:46.027 05:29:56 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:46.027 05:29:56 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:46.027 05:29:56 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:46.027 05:29:56 -- setup/common.sh@46 -- # (( part++ )) 00:04:46.027 05:29:56 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:46.027 05:29:56 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:46.027 05:29:56 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:46.027 05:29:56 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:46.969 Creating new GPT entries in memory. 00:04:46.969 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:46.969 other utilities. 00:04:46.969 05:29:57 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:46.969 05:29:57 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:46.969 05:29:57 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:46.969 05:29:57 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:46.969 05:29:57 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:47.909 Creating new GPT entries in memory. 00:04:47.909 The operation has completed successfully. 00:04:47.909 05:29:58 -- setup/common.sh@57 -- # (( part++ )) 00:04:47.909 05:29:58 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:47.909 05:29:58 -- setup/common.sh@62 -- # wait 2173183 00:04:47.909 05:29:59 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.909 05:29:59 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:47.909 05:29:59 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.909 05:29:59 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:47.909 05:29:59 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:47.909 05:29:59 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.909 05:29:59 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:47.909 05:29:59 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:47.909 05:29:59 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:47.909 05:29:59 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.909 05:29:59 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:47.909 05:29:59 -- setup/devices.sh@53 -- # local found=0 00:04:47.909 05:29:59 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:47.909 05:29:59 -- setup/devices.sh@56 -- # : 00:04:47.909 05:29:59 -- setup/devices.sh@59 -- # local pci status 00:04:47.909 05:29:59 -- 
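The nvme_mount flow traced here is: zap any existing GPT (`sgdisk --zap-all`), create a single 1 GiB partition (`--new=1:2048:2099199`, i.e. 2,097,152 sectors of 512 B starting at sector 2048), wait for the partition uevent, then format and mount it. A condensed sketch of the same sequence; DISK and MNT are placeholders, and the uevent wait is simplified to `udevadm settle` rather than SPDK's sync_dev_uevents.sh helper:

```bash
# Hedged sketch of the partition -> format -> mount sequence traced above.
# DISK/MNT are placeholders; this is destructive, adjust before running.
DISK=/dev/nvme0n1
MNT=/tmp/nvme_mount
size_sectors=$(( 1024 * 1024 * 1024 / 512 ))      # 1 GiB in 512 B sectors
part_start=2048
part_end=$(( part_start + size_sectors - 1 ))     # 2099199, as in the log

sgdisk "$DISK" --zap-all                          # destroy old GPT/MBR
sgdisk "$DISK" --new=1:${part_start}:${part_end}  # one 1 GiB partition
udevadm settle                                    # wait for /dev/nvme0n1p1
mkdir -p "$MNT"
mkfs.ext4 -qF "${DISK}p1"
mount "${DISK}p1" "$MNT"
```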
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:47.909 05:29:59 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:47.909 05:29:59 -- setup/devices.sh@47 -- # setup output config 00:04:47.909 05:29:59 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:47.909 05:29:59 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:51.245 05:30:02 -- setup/devices.sh@63 -- # found=1 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.245 05:30:02 -- 
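The long run of `[[ 0000:xx:04.y == \0\0\0\0... ]]` checks above is devices.sh's verify() loop: `setup.sh status` prints one `<bdf> <vendor> <device> <status>` row per device, the loop consumes each row with `read -r pci _ _ status`, and only the row whose BDF equals `$PCI_ALLOWED` (here 0000:d8:00.0) is tested for the expected "Active devices: ... so not binding PCI dev" message. A stripped-down sketch of that pattern, with a here-doc standing in for the real setup.sh output:

```bash
# Hedged sketch of the verify loop traced above. Real input comes from
# "setup.sh status"; a here-doc stands in for it here.
PCI_ALLOWED=0000:d8:00.0
expected='mount@nvme0n1:nvme0n1p1'
found=0
while read -r pci _ _ status; do
    [[ $pci == "$PCI_ALLOWED" ]] || continue
    # Only the allowed device's status column matters.
    if [[ $status == *"Active devices: "*"$expected"* ]]; then
        found=1
    fi
done <<'EOF'
0000:00:04.0 8086 2021 ioatdma
0000:d8:00.0 8086 0a54 Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev
EOF
(( found == 1 )) && echo "allowed device is in the expected state"
```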
setup/devices.sh@66 -- # (( found == 1 )) 00:04:51.245 05:30:02 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:51.245 05:30:02 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:51.245 05:30:02 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:51.245 05:30:02 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:51.245 05:30:02 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:51.245 05:30:02 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:51.245 05:30:02 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:51.245 05:30:02 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:51.245 05:30:02 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:51.557 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:51.557 05:30:02 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:51.557 05:30:02 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:51.557 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:51.557 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:51.557 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:51.557 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:51.557 05:30:02 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:51.557 05:30:02 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:51.557 05:30:02 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:51.557 05:30:02 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:51.557 05:30:02 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:51.557 05:30:02 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:51.557 05:30:02 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:51.557 05:30:02 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:51.557 05:30:02 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:51.557 05:30:02 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:51.557 05:30:02 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:51.557 05:30:02 -- setup/devices.sh@53 -- # local found=0 00:04:51.557 05:30:02 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:51.557 05:30:02 -- setup/devices.sh@56 -- # : 00:04:51.557 05:30:02 -- setup/devices.sh@59 -- # local pci status 00:04:51.557 05:30:02 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.557 05:30:02 -- setup/devices.sh@47 -- # 
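Teardown (cleanup_nvme) is the mirror image: unmount if mounted, then `wipefs --all` on the partition and on the whole disk so the next sub-test starts from a blank device; the log records each erased signature (ext4's 53 ef magic at 0x438, both GPT headers, and the protective MBR's 55 aa). A sketch of that cleanup, again with a placeholder disk and destructive semantics:

```bash
# Hedged sketch of cleanup_nvme as traced above. Destructive; DISK/MNT
# are placeholders.
DISK=/dev/nvme0n1
MNT=/tmp/nvme_mount
mountpoint -q "$MNT" && umount "$MNT"
# Erase filesystem magic on the partition, then GPT/PMBR on the disk;
# wipefs prints each signature it removes, as seen in the log.
[[ -b ${DISK}p1 ]] && wipefs --all "${DISK}p1"
[[ -b $DISK ]] && wipefs --all "$DISK"
```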
PCI_ALLOWED=0000:d8:00.0 00:04:51.557 05:30:02 -- setup/devices.sh@47 -- # setup output config 00:04:51.557 05:30:02 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.557 05:30:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:54.939 05:30:06 -- setup/devices.sh@63 -- # found=1 00:04:54.939 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.939 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.939 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.939 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.939 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.939 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.939 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.939 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.939 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.939 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.940 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.940 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.940 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.940 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.940 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.940 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.940 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.940 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.940 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.940 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.940 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.940 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.940 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.940 05:30:06 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:54.940 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.199 05:30:06 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:55.199 05:30:06 -- setup/devices.sh@68 -- # [[ -n 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:55.199 05:30:06 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:55.199 05:30:06 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:55.199 05:30:06 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:55.199 05:30:06 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:55.199 05:30:06 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:55.199 05:30:06 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:55.199 05:30:06 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:55.199 05:30:06 -- setup/devices.sh@50 -- # local mount_point= 00:04:55.199 05:30:06 -- setup/devices.sh@51 -- # local test_file= 00:04:55.199 05:30:06 -- setup/devices.sh@53 -- # local found=0 00:04:55.199 05:30:06 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:55.199 05:30:06 -- setup/devices.sh@59 -- # local pci status 00:04:55.199 05:30:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.199 05:30:06 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:55.199 05:30:06 -- setup/devices.sh@47 -- # setup output config 00:04:55.199 05:30:06 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.199 05:30:06 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:58.478 05:30:09 -- setup/devices.sh@63 -- # found=1 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 
05:30:09 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.478 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.478 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.479 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.479 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.479 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.479 05:30:09 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.479 05:30:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.479 05:30:09 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:58.479 05:30:09 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:58.479 05:30:09 -- setup/devices.sh@68 -- # return 0 00:04:58.479 05:30:09 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:58.479 05:30:09 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:58.479 05:30:09 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:58.479 05:30:09 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:58.479 05:30:09 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:58.479 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:58.479 00:04:58.479 real 0m12.492s 00:04:58.479 user 0m3.634s 00:04:58.479 sys 0m6.773s 00:04:58.479 05:30:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:58.479 05:30:09 -- common/autotest_common.sh@10 -- # set +x 00:04:58.479 ************************************ 00:04:58.479 END TEST nvme_mount 00:04:58.479 ************************************ 00:04:58.479 05:30:09 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:58.479 05:30:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:58.479 05:30:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:58.479 05:30:09 -- common/autotest_common.sh@10 -- # set +x 00:04:58.479 ************************************ 00:04:58.479 START TEST dm_mount 00:04:58.479 ************************************ 00:04:58.479 05:30:09 -- common/autotest_common.sh@1114 -- # dm_mount 00:04:58.479 05:30:09 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:58.479 05:30:09 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:58.479 05:30:09 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:58.479 05:30:09 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:58.479 05:30:09 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:58.479 05:30:09 -- setup/common.sh@40 -- # local part_no=2 00:04:58.479 05:30:09 -- setup/common.sh@41 -- # local size=1073741824 00:04:58.479 05:30:09 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:58.479 05:30:09 -- setup/common.sh@44 -- # parts=() 00:04:58.479 05:30:09 -- setup/common.sh@44 -- # local parts 00:04:58.479 05:30:09 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:58.479 05:30:09 -- setup/common.sh@46 -- # (( part <= part_no )) 
00:04:58.479 05:30:09 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:58.479 05:30:09 -- setup/common.sh@46 -- # (( part++ )) 00:04:58.479 05:30:09 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:58.479 05:30:09 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:58.479 05:30:09 -- setup/common.sh@46 -- # (( part++ )) 00:04:58.479 05:30:09 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:58.479 05:30:09 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:58.479 05:30:09 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:58.479 05:30:09 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:59.409 Creating new GPT entries in memory. 00:04:59.409 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:59.409 other utilities. 00:04:59.409 05:30:10 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:59.409 05:30:10 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:59.409 05:30:10 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:59.409 05:30:10 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:59.409 05:30:10 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:00.346 Creating new GPT entries in memory. 00:05:00.346 The operation has completed successfully. 00:05:00.346 05:30:11 -- setup/common.sh@57 -- # (( part++ )) 00:05:00.346 05:30:11 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:00.346 05:30:11 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:00.346 05:30:11 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:00.346 05:30:11 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:01.283 The operation has completed successfully. 
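For dm_mount the same drive gets two 1 GiB partitions (`--new=1:2048:2099199` and `--new=2:2099200:4196351` above), which are then joined into one device-mapper node; the trace that follows shows `dmsetup create nvme_dm_test` plus checks that /dev/mapper/nvme_dm_test resolves to dm-0 and that both partitions list dm-0 under holders/. SPDK drives dmsetup through its own helpers, so the table below is an assumed linear concatenation, not lifted from the log:

```bash
# Hedged sketch: concatenate two equal partitions into one linear
# device-mapper device. The table format is "start length linear dev offset"
# with lengths in 512 B sectors; this mapping is assumed, not from the log.
P1=/dev/nvme0n1p1
P2=/dev/nvme0n1p2
len=$(blockdev --getsz "$P1")     # sectors in the first 1 GiB partition
dmsetup create nvme_dm_test <<EOF
0 $len linear $P1 0
$len $(blockdev --getsz "$P2") linear $P2 0
EOF
# Resolve the friendly name to its dm node, as the trace does.
readlink -f /dev/mapper/nvme_dm_test
```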
00:05:01.283 05:30:12 -- setup/common.sh@57 -- # (( part++ )) 00:05:01.283 05:30:12 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:01.283 05:30:12 -- setup/common.sh@62 -- # wait 2178272 00:05:01.542 05:30:12 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:01.542 05:30:12 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:01.542 05:30:12 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:01.542 05:30:12 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:01.542 05:30:12 -- setup/devices.sh@160 -- # for t in {1..5} 00:05:01.542 05:30:12 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:01.542 05:30:12 -- setup/devices.sh@161 -- # break 00:05:01.542 05:30:12 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:01.542 05:30:12 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:01.542 05:30:12 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:01.543 05:30:12 -- setup/devices.sh@166 -- # dm=dm-0 00:05:01.543 05:30:12 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:01.543 05:30:12 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:01.543 05:30:12 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:01.543 05:30:12 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:01.543 05:30:12 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:01.543 05:30:12 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:01.543 05:30:12 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:01.543 05:30:12 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:01.543 05:30:12 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:01.543 05:30:12 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:01.543 05:30:12 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:01.543 05:30:12 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:01.543 05:30:12 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:01.543 05:30:12 -- setup/devices.sh@53 -- # local found=0 00:05:01.543 05:30:12 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:01.543 05:30:12 -- setup/devices.sh@56 -- # : 00:05:01.543 05:30:12 -- setup/devices.sh@59 -- # local pci status 00:05:01.543 05:30:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.543 05:30:12 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:01.543 05:30:12 -- setup/devices.sh@47 -- # setup output config 00:05:01.543 05:30:12 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:01.543 05:30:12 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:04.828 05:30:15 -- 
setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:04.828 05:30:15 -- setup/devices.sh@63 -- # found=1 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.828 05:30:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:16 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:04.828 05:30:16 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:04.828 05:30:16 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:04.828 05:30:16 -- setup/devices.sh@73 -- # 
[[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:04.828 05:30:16 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:04.828 05:30:16 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:04.828 05:30:16 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:04.828 05:30:16 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:04.828 05:30:16 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:04.828 05:30:16 -- setup/devices.sh@50 -- # local mount_point= 00:05:04.828 05:30:16 -- setup/devices.sh@51 -- # local test_file= 00:05:04.828 05:30:16 -- setup/devices.sh@53 -- # local found=0 00:05:04.828 05:30:16 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:04.828 05:30:16 -- setup/devices.sh@59 -- # local pci status 00:05:04.828 05:30:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.828 05:30:16 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:04.828 05:30:16 -- setup/devices.sh@47 -- # setup output config 00:05:04.828 05:30:16 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.828 05:30:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:08.115 05:30:18 -- setup/devices.sh@63 -- # found=1 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.115 05:30:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.115 05:30:18 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:08.115 05:30:18 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:08.115 05:30:18 -- setup/devices.sh@68 -- # return 0 00:05:08.115 05:30:18 -- setup/devices.sh@187 -- # cleanup_dm 00:05:08.115 05:30:18 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:08.115 05:30:18 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:08.115 05:30:18 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:08.115 05:30:18 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:08.116 05:30:18 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:08.116 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:08.116 05:30:18 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:08.116 05:30:18 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:08.116 00:05:08.116 real 0m9.542s 00:05:08.116 user 0m2.134s 00:05:08.116 sys 0m4.317s 00:05:08.116 05:30:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:08.116 05:30:19 -- common/autotest_common.sh@10 -- # set +x 00:05:08.116 ************************************ 00:05:08.116 END TEST dm_mount 00:05:08.116 ************************************ 00:05:08.116 05:30:19 -- setup/devices.sh@1 -- # cleanup 00:05:08.116 05:30:19 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:08.116 05:30:19 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:08.116 05:30:19 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:08.116 05:30:19 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:08.116 05:30:19 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:08.116 05:30:19 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:08.116 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:08.116 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:08.116 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:08.116 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:08.116 05:30:19 -- setup/devices.sh@12 -- # cleanup_dm 00:05:08.116 05:30:19 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:08.116 05:30:19 -- setup/devices.sh@36 -- # [[ -L 
/dev/mapper/nvme_dm_test ]] 00:05:08.116 05:30:19 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:08.116 05:30:19 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:08.116 05:30:19 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:08.116 05:30:19 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:08.116 00:05:08.116 real 0m25.937s 00:05:08.116 user 0m6.971s 00:05:08.116 sys 0m13.638s 00:05:08.116 05:30:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:08.116 05:30:19 -- common/autotest_common.sh@10 -- # set +x 00:05:08.116 ************************************ 00:05:08.116 END TEST devices 00:05:08.116 ************************************ 00:05:08.116 00:05:08.116 real 1m32.824s 00:05:08.116 user 0m28.526s 00:05:08.116 sys 0m53.179s 00:05:08.116 05:30:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:08.116 05:30:19 -- common/autotest_common.sh@10 -- # set +x 00:05:08.116 ************************************ 00:05:08.116 END TEST setup.sh 00:05:08.116 ************************************ 00:05:08.375 05:30:19 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:11.665 Hugepages 00:05:11.665 node hugesize free / total 00:05:11.665 node0 1048576kB 0 / 0 00:05:11.665 node0 2048kB 2048 / 2048 00:05:11.665 node1 1048576kB 0 / 0 00:05:11.665 node1 2048kB 0 / 0 00:05:11.665 00:05:11.665 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:11.665 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:11.665 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:11.665 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:11.665 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:11.665 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:11.665 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:11.665 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:11.665 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:11.665 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:11.665 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:11.665 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:11.665 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:11.665 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:11.665 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:11.665 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:11.665 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:11.665 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:11.665 05:30:22 -- spdk/autotest.sh@128 -- # uname -s 00:05:11.665 05:30:22 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:05:11.665 05:30:22 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:05:11.665 05:30:22 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:14.957 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 
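The Hugepages table above comes from `setup.sh status`: per-NUMA-node counts read from sysfs, showing 2048 free of 2048 reserved 2 MiB pages on node0 and none on node1, followed by a BDF table in which the sixteen I/OAT DMA channels sit on ioatdma and the single NVMe drive (0000:d8:00.0, device 0a54) carries nvme0n1. The same counters can be read directly from standard sysfs paths, as in this sketch:

```bash
# Hedged sketch: print the per-node hugepage table shown above straight
# from sysfs (free_hugepages / nr_hugepages per page size per node).
for node in /sys/devices/system/node/node*; do
    for hp in "$node"/hugepages/hugepages-*; do
        size_kb=${hp##*hugepages-}; size_kb=${size_kb%kB}
        printf '%s %skB %s / %s\n' "${node##*/}" "$size_kb" \
            "$(< "$hp/free_hugepages")" "$(< "$hp/nr_hugepages")"
    done
done
```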
0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:14.957 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:16.337 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:16.597 05:30:27 -- common/autotest_common.sh@1527 -- # sleep 1 00:05:17.534 05:30:28 -- common/autotest_common.sh@1528 -- # bdfs=() 00:05:17.534 05:30:28 -- common/autotest_common.sh@1528 -- # local bdfs 00:05:17.534 05:30:28 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:05:17.534 05:30:28 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:05:17.534 05:30:28 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:17.534 05:30:28 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:17.534 05:30:28 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:17.534 05:30:28 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:17.534 05:30:28 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:17.534 05:30:28 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:17.534 05:30:28 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:17.534 05:30:28 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:20.823 Waiting for block devices as requested 00:05:20.823 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:20.823 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:21.081 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:21.081 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:21.081 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:21.081 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:21.340 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:21.340 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:21.340 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:21.598 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:21.598 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:21.598 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:21.857 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:21.857 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:21.857 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:22.115 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:22.115 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:22.374 05:30:33 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:05:22.374 05:30:33 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:22.374 05:30:33 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:05:22.374 05:30:33 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:05:22.374 05:30:33 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:22.374 05:30:33 -- common/autotest_common.sh@1498 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:22.374 05:30:33 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:22.374 05:30:33 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:05:22.374 05:30:33 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:05:22.374 05:30:33 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:05:22.374 05:30:33 -- common/autotest_common.sh@1540 -- # grep oacs 00:05:22.374 05:30:33 -- 
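Two lookups recur in the trace above: enumerating NVMe BDFs (SPDK pipes gen_nvme.sh through `jq -r '.config[].params.traddr'`) and mapping a BDF back to its /dev controller by resolving the /sys/class/nvme symlink and checking that the resolved path embeds the BDF. A dependency-free version of the second lookup, matching the readlink/grep/basename steps in the log:

```bash
# Hedged sketch: resolve a PCI BDF (e.g. 0000:d8:00.0) to its nvme
# controller name by walking /sys/class/nvme, as the trace does with
# readlink + grep + basename.
get_nvme_ctrlr_from_bdf() {
    local bdf=$1 link target
    for link in /sys/class/nvme/nvme*; do
        target=$(readlink -f "$link")
        # The resolved path embeds the device's PCI address, e.g.
        # /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0
        if [[ $target == *"/$bdf/nvme/"* ]]; then
            basename "$target"      # e.g. nvme0
            return 0
        fi
    done
    return 1
}
get_nvme_ctrlr_from_bdf 0000:d8:00.0
```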
common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:22.374 05:30:33 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:22.374 05:30:33 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:05:22.374 05:30:33 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:05:22.374 05:30:33 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:05:22.374 05:30:33 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:05:22.374 05:30:33 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:05:22.374 05:30:33 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:05:22.374 05:30:33 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:05:22.374 05:30:33 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:05:22.374 05:30:33 -- common/autotest_common.sh@1552 -- # continue 00:05:22.374 05:30:33 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:05:22.374 05:30:33 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:22.374 05:30:33 -- common/autotest_common.sh@10 -- # set +x 00:05:22.374 05:30:33 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:05:22.374 05:30:33 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:22.374 05:30:33 -- common/autotest_common.sh@10 -- # set +x 00:05:22.374 05:30:33 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:25.657 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:25.657 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:27.032 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:27.032 05:30:38 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:05:27.032 05:30:38 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:27.032 05:30:38 -- common/autotest_common.sh@10 -- # set +x 00:05:27.032 05:30:38 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:05:27.032 05:30:38 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:05:27.032 05:30:38 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:05:27.032 05:30:38 -- common/autotest_common.sh@1572 -- # bdfs=() 00:05:27.032 05:30:38 -- common/autotest_common.sh@1572 -- # local bdfs 00:05:27.032 05:30:38 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs 00:05:27.032 05:30:38 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:27.032 05:30:38 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:27.032 05:30:38 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:27.032 05:30:38 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:27.032 05:30:38 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:27.032 05:30:38 -- 
common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:27.032 05:30:38 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:27.032 05:30:38 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:05:27.032 05:30:38 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:27.032 05:30:38 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:05:27.032 05:30:38 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:27.032 05:30:38 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:05:27.032 05:30:38 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:05:27.032 05:30:38 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:05:27.032 05:30:38 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=2187949 00:05:27.032 05:30:38 -- common/autotest_common.sh@1593 -- # waitforlisten 2187949 00:05:27.032 05:30:38 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:27.032 05:30:38 -- common/autotest_common.sh@829 -- # '[' -z 2187949 ']' 00:05:27.032 05:30:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:27.032 05:30:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:27.032 05:30:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:27.032 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:27.032 05:30:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:27.032 05:30:38 -- common/autotest_common.sh@10 -- # set +x 00:05:27.032 [2024-11-29 05:30:38.282647] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
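The id-ctrl probe traced above is worth unpacking: the script greps the OACS word out of nvme id-ctrl, masks out the Namespace Management capability, and then checks unvmcap, moving on to the next device when no NVM capacity is left unallocated. A minimal standalone sketch of that check, assuming nvme-cli's usual "field : value" output and /dev/nvme0 as in this run (the echo is illustrative, not from the script):

  # bit 3 of OACS is Namespace Management support per the NVMe spec
  oacs=$(nvme id-ctrl /dev/nvme0 | grep oacs | cut -d: -f2)   # ' 0xe'
  oacs_ns_manage=$((oacs & 0x8))                              # 8 => supported
  if [[ $oacs_ns_manage -ne 0 ]]; then
    unvmcap=$(nvme id-ctrl /dev/nvme0 | grep unvmcap | cut -d: -f2)
    [[ $unvmcap -eq 0 ]] && echo "no unallocated capacity on /dev/nvme0, nothing to revert"
  fi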
00:05:27.032 [2024-11-29 05:30:38.282733] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2187949 ] 00:05:27.032 EAL: No free 2048 kB hugepages reported on node 1 00:05:27.290 [2024-11-29 05:30:38.352789] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:27.291 [2024-11-29 05:30:38.394957] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:27.291 [2024-11-29 05:30:38.395086] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.857 05:30:39 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:27.857 05:30:39 -- common/autotest_common.sh@862 -- # return 0 00:05:27.857 05:30:39 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:05:27.857 05:30:39 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:05:27.857 05:30:39 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:31.141 nvme0n1 00:05:31.141 05:30:42 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:31.141 [2024-11-29 05:30:42.286561] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:31.141 request: 00:05:31.141 { 00:05:31.141 "nvme_ctrlr_name": "nvme0", 00:05:31.141 "password": "test", 00:05:31.141 "method": "bdev_nvme_opal_revert", 00:05:31.141 "req_id": 1 00:05:31.141 } 00:05:31.141 Got JSON-RPC error response 00:05:31.141 response: 00:05:31.141 { 00:05:31.141 "code": -32602, 00:05:31.141 "message": "Invalid parameters" 00:05:31.141 } 00:05:31.141 05:30:42 -- common/autotest_common.sh@1599 -- # true 00:05:31.141 05:30:42 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:05:31.141 05:30:42 -- common/autotest_common.sh@1603 -- # killprocess 2187949 00:05:31.141 05:30:42 -- common/autotest_common.sh@936 -- # '[' -z 2187949 ']' 00:05:31.141 05:30:42 -- common/autotest_common.sh@940 -- # kill -0 2187949 00:05:31.141 05:30:42 -- common/autotest_common.sh@941 -- # uname 00:05:31.141 05:30:42 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:31.141 05:30:42 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2187949 00:05:31.141 05:30:42 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:31.141 05:30:42 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:31.142 05:30:42 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2187949' 00:05:31.142 killing process with pid 2187949 00:05:31.142 05:30:42 -- common/autotest_common.sh@955 -- # kill 2187949 00:05:31.142 05:30:42 -- common/autotest_common.sh@960 -- # wait 2187949 00:05:31.142 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:31.142 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:31.142 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:31.142 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:31.142 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:31.142 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:31.142 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:31.142 EAL: Unexpected size 0 of DMA remapping cleared 
instead of 2097152 00:05:33.303 05:30:44 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:05:33.303 05:30:44 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:05:33.303 05:30:44 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:33.303 05:30:44 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:33.303 05:30:44 -- spdk/autotest.sh@160 -- # timing_enter lib 00:05:33.303 05:30:44 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:33.303 05:30:44 -- common/autotest_common.sh@10 -- # set +x 00:05:33.303 05:30:44 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:33.303 05:30:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:33.303 05:30:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:33.303 05:30:44 -- common/autotest_common.sh@10 -- # set +x 00:05:33.303 ************************************ 00:05:33.303 START TEST env 00:05:33.303 ************************************ 00:05:33.303 05:30:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:33.562 * Looking for test storage... 00:05:33.562 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:33.562 05:30:44 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:33.562 05:30:44 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:33.562 05:30:44 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:33.562 05:30:44 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:33.562 05:30:44 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:33.562 05:30:44 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:33.562 05:30:44 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:33.562 05:30:44 -- scripts/common.sh@335 -- # IFS=.-: 00:05:33.562 05:30:44 -- scripts/common.sh@335 -- # read -ra ver1 00:05:33.562 05:30:44 -- scripts/common.sh@336 -- # IFS=.-: 00:05:33.562 05:30:44 -- scripts/common.sh@336 -- # read -ra ver2 00:05:33.562 05:30:44 -- scripts/common.sh@337 -- # local 'op=<' 00:05:33.562 05:30:44 -- scripts/common.sh@339 -- # ver1_l=2 00:05:33.562 05:30:44 -- scripts/common.sh@340 -- # ver2_l=1 00:05:33.562 05:30:44 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:33.562 05:30:44 -- scripts/common.sh@343 -- # case "$op" in 00:05:33.562 05:30:44 -- scripts/common.sh@344 -- # : 1 00:05:33.562 05:30:44 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:33.562 05:30:44 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ?
ver1_l : ver2_l) )) 00:05:33.562 05:30:44 -- scripts/common.sh@364 -- # decimal 1 00:05:33.562 05:30:44 -- scripts/common.sh@352 -- # local d=1 00:05:33.562 05:30:44 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:33.562 05:30:44 -- scripts/common.sh@354 -- # echo 1 00:05:33.562 05:30:44 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:33.562 05:30:44 -- scripts/common.sh@365 -- # decimal 2 00:05:33.562 05:30:44 -- scripts/common.sh@352 -- # local d=2 00:05:33.562 05:30:44 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:33.562 05:30:44 -- scripts/common.sh@354 -- # echo 2 00:05:33.562 05:30:44 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:33.562 05:30:44 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:33.562 05:30:44 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:33.562 05:30:44 -- scripts/common.sh@367 -- # return 0 00:05:33.562 05:30:44 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:33.562 05:30:44 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:33.562 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.562 --rc genhtml_branch_coverage=1 00:05:33.562 --rc genhtml_function_coverage=1 00:05:33.562 --rc genhtml_legend=1 00:05:33.562 --rc geninfo_all_blocks=1 00:05:33.562 --rc geninfo_unexecuted_blocks=1 00:05:33.562 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.562 ' 00:05:33.562 05:30:44 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:33.562 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.562 --rc genhtml_branch_coverage=1 00:05:33.562 --rc genhtml_function_coverage=1 00:05:33.562 --rc genhtml_legend=1 00:05:33.562 --rc geninfo_all_blocks=1 00:05:33.562 --rc geninfo_unexecuted_blocks=1 00:05:33.562 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.562 ' 00:05:33.562 05:30:44 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:33.562 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.562 --rc genhtml_branch_coverage=1 00:05:33.562 --rc genhtml_function_coverage=1 00:05:33.562 --rc genhtml_legend=1 00:05:33.562 --rc geninfo_all_blocks=1 00:05:33.562 --rc geninfo_unexecuted_blocks=1 00:05:33.562 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.562 ' 00:05:33.563 05:30:44 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:33.563 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.563 --rc genhtml_branch_coverage=1 00:05:33.563 --rc genhtml_function_coverage=1 00:05:33.563 --rc genhtml_legend=1 00:05:33.563 --rc geninfo_all_blocks=1 00:05:33.563 --rc geninfo_unexecuted_blocks=1 00:05:33.563 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.563 ' 00:05:33.563 05:30:44 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:33.563 05:30:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:33.563 05:30:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:33.563 05:30:44 -- common/autotest_common.sh@10 -- # set +x 00:05:33.563 ************************************ 00:05:33.563 START TEST env_memory 00:05:33.563 ************************************ 00:05:33.563 05:30:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
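The lt 1.15 2 call traced above is scripts/common.sh deciding that the installed lcov (1.15) predates version 2 before enabling the branch/function coverage rc-options. A condensed, self-contained sketch of that component-by-component compare (simplified from the traced cmp_versions; missing components are treated as 0):

  lt() {
    local -a ver1 ver2
    local v
    IFS=.-: read -ra ver1 <<< "$1"   # split on '.', '-' and ':' as the trace does
    IFS=.-: read -ra ver2 <<< "$2"
    for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
      if (( ${ver1[v]:-0} < ${ver2[v]:-0} )); then return 0; fi
      if (( ${ver1[v]:-0} > ${ver2[v]:-0} )); then return 1; fi
    done
    return 1   # equal is not less-than
  }
  lt 1.15 2 && echo "lcov 1.15 predates 2: legacy coverage rc-options apply"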
00:05:33.563 00:05:33.563 00:05:33.563 CUnit - A unit testing framework for C - Version 2.1-3 00:05:33.563 http://cunit.sourceforge.net/ 00:05:33.563 00:05:33.563 00:05:33.563 Suite: memory 00:05:33.563 Test: alloc and free memory map ...[2024-11-29 05:30:44.803779] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:33.563 passed 00:05:33.563 Test: mem map translation ...[2024-11-29 05:30:44.816503] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:33.563 [2024-11-29 05:30:44.816520] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:33.563 [2024-11-29 05:30:44.816564] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:33.563 [2024-11-29 05:30:44.816572] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:33.563 passed 00:05:33.563 Test: mem map registration ...[2024-11-29 05:30:44.836047] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:33.563 [2024-11-29 05:30:44.836067] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:33.563 passed 00:05:33.563 Test: mem map adjacent registrations ...passed 00:05:33.563 00:05:33.563 Run Summary: Type Total Ran Passed Failed Inactive 00:05:33.563 suites 1 1 n/a 0 0 00:05:33.563 tests 4 4 4 0 0 00:05:33.563 asserts 152 152 152 0 n/a 00:05:33.563 00:05:33.563 Elapsed time = 0.081 seconds 00:05:33.823 00:05:33.823 real 0m0.094s 00:05:33.823 user 0m0.082s 00:05:33.823 sys 0m0.012s 00:05:33.823 05:30:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:33.823 05:30:44 -- common/autotest_common.sh@10 -- # set +x 00:05:33.823 ************************************ 00:05:33.823 END TEST env_memory 00:05:33.823 ************************************ 00:05:33.823 05:30:44 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:33.823 05:30:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:33.823 05:30:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:33.823 05:30:44 -- common/autotest_common.sh@10 -- # set +x 00:05:33.823 ************************************ 00:05:33.823 START TEST env_vtophys 00:05:33.823 ************************************ 00:05:33.823 05:30:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:33.823 EAL: lib.eal log level changed from notice to debug 00:05:33.823 EAL: Detected lcore 0 as core 0 on socket 0 00:05:33.823 EAL: Detected lcore 1 as core 1 on socket 0 00:05:33.823 EAL: Detected lcore 2 as core 2 on socket 0 00:05:33.823 EAL: Detected lcore 3 as core 3 on socket 0 00:05:33.823 EAL: Detected lcore 4 as core 4 on socket 0 00:05:33.823 EAL: Detected lcore 5 as core 5 on socket 0 00:05:33.823 EAL: Detected lcore 6 as 
core 6 on socket 0 00:05:33.823 EAL: Detected lcore 7 as core 8 on socket 0 00:05:33.823 EAL: Detected lcore 8 as core 9 on socket 0 00:05:33.823 EAL: Detected lcore 9 as core 10 on socket 0 00:05:33.823 EAL: Detected lcore 10 as core 11 on socket 0 00:05:33.823 EAL: Detected lcore 11 as core 12 on socket 0 00:05:33.823 EAL: Detected lcore 12 as core 13 on socket 0 00:05:33.823 EAL: Detected lcore 13 as core 14 on socket 0 00:05:33.823 EAL: Detected lcore 14 as core 16 on socket 0 00:05:33.823 EAL: Detected lcore 15 as core 17 on socket 0 00:05:33.823 EAL: Detected lcore 16 as core 18 on socket 0 00:05:33.823 EAL: Detected lcore 17 as core 19 on socket 0 00:05:33.823 EAL: Detected lcore 18 as core 20 on socket 0 00:05:33.823 EAL: Detected lcore 19 as core 21 on socket 0 00:05:33.823 EAL: Detected lcore 20 as core 22 on socket 0 00:05:33.823 EAL: Detected lcore 21 as core 24 on socket 0 00:05:33.823 EAL: Detected lcore 22 as core 25 on socket 0 00:05:33.823 EAL: Detected lcore 23 as core 26 on socket 0 00:05:33.823 EAL: Detected lcore 24 as core 27 on socket 0 00:05:33.823 EAL: Detected lcore 25 as core 28 on socket 0 00:05:33.823 EAL: Detected lcore 26 as core 29 on socket 0 00:05:33.823 EAL: Detected lcore 27 as core 30 on socket 0 00:05:33.823 EAL: Detected lcore 28 as core 0 on socket 1 00:05:33.823 EAL: Detected lcore 29 as core 1 on socket 1 00:05:33.823 EAL: Detected lcore 30 as core 2 on socket 1 00:05:33.823 EAL: Detected lcore 31 as core 3 on socket 1 00:05:33.823 EAL: Detected lcore 32 as core 4 on socket 1 00:05:33.823 EAL: Detected lcore 33 as core 5 on socket 1 00:05:33.823 EAL: Detected lcore 34 as core 6 on socket 1 00:05:33.823 EAL: Detected lcore 35 as core 8 on socket 1 00:05:33.823 EAL: Detected lcore 36 as core 9 on socket 1 00:05:33.823 EAL: Detected lcore 37 as core 10 on socket 1 00:05:33.823 EAL: Detected lcore 38 as core 11 on socket 1 00:05:33.823 EAL: Detected lcore 39 as core 12 on socket 1 00:05:33.823 EAL: Detected lcore 40 as core 13 on socket 1 00:05:33.823 EAL: Detected lcore 41 as core 14 on socket 1 00:05:33.823 EAL: Detected lcore 42 as core 16 on socket 1 00:05:33.823 EAL: Detected lcore 43 as core 17 on socket 1 00:05:33.823 EAL: Detected lcore 44 as core 18 on socket 1 00:05:33.823 EAL: Detected lcore 45 as core 19 on socket 1 00:05:33.823 EAL: Detected lcore 46 as core 20 on socket 1 00:05:33.823 EAL: Detected lcore 47 as core 21 on socket 1 00:05:33.823 EAL: Detected lcore 48 as core 22 on socket 1 00:05:33.823 EAL: Detected lcore 49 as core 24 on socket 1 00:05:33.823 EAL: Detected lcore 50 as core 25 on socket 1 00:05:33.823 EAL: Detected lcore 51 as core 26 on socket 1 00:05:33.823 EAL: Detected lcore 52 as core 27 on socket 1 00:05:33.823 EAL: Detected lcore 53 as core 28 on socket 1 00:05:33.823 EAL: Detected lcore 54 as core 29 on socket 1 00:05:33.823 EAL: Detected lcore 55 as core 30 on socket 1 00:05:33.823 EAL: Detected lcore 56 as core 0 on socket 0 00:05:33.823 EAL: Detected lcore 57 as core 1 on socket 0 00:05:33.823 EAL: Detected lcore 58 as core 2 on socket 0 00:05:33.823 EAL: Detected lcore 59 as core 3 on socket 0 00:05:33.823 EAL: Detected lcore 60 as core 4 on socket 0 00:05:33.823 EAL: Detected lcore 61 as core 5 on socket 0 00:05:33.823 EAL: Detected lcore 62 as core 6 on socket 0 00:05:33.823 EAL: Detected lcore 63 as core 8 on socket 0 00:05:33.823 EAL: Detected lcore 64 as core 9 on socket 0 00:05:33.823 EAL: Detected lcore 65 as core 10 on socket 0 00:05:33.823 EAL: Detected lcore 66 as core 11 on socket 0 00:05:33.823 EAL: 
Detected lcore 67 as core 12 on socket 0 00:05:33.823 EAL: Detected lcore 68 as core 13 on socket 0 00:05:33.823 EAL: Detected lcore 69 as core 14 on socket 0 00:05:33.823 EAL: Detected lcore 70 as core 16 on socket 0 00:05:33.823 EAL: Detected lcore 71 as core 17 on socket 0 00:05:33.823 EAL: Detected lcore 72 as core 18 on socket 0 00:05:33.823 EAL: Detected lcore 73 as core 19 on socket 0 00:05:33.823 EAL: Detected lcore 74 as core 20 on socket 0 00:05:33.823 EAL: Detected lcore 75 as core 21 on socket 0 00:05:33.823 EAL: Detected lcore 76 as core 22 on socket 0 00:05:33.823 EAL: Detected lcore 77 as core 24 on socket 0 00:05:33.823 EAL: Detected lcore 78 as core 25 on socket 0 00:05:33.823 EAL: Detected lcore 79 as core 26 on socket 0 00:05:33.823 EAL: Detected lcore 80 as core 27 on socket 0 00:05:33.823 EAL: Detected lcore 81 as core 28 on socket 0 00:05:33.823 EAL: Detected lcore 82 as core 29 on socket 0 00:05:33.823 EAL: Detected lcore 83 as core 30 on socket 0 00:05:33.823 EAL: Detected lcore 84 as core 0 on socket 1 00:05:33.823 EAL: Detected lcore 85 as core 1 on socket 1 00:05:33.823 EAL: Detected lcore 86 as core 2 on socket 1 00:05:33.823 EAL: Detected lcore 87 as core 3 on socket 1 00:05:33.823 EAL: Detected lcore 88 as core 4 on socket 1 00:05:33.823 EAL: Detected lcore 89 as core 5 on socket 1 00:05:33.823 EAL: Detected lcore 90 as core 6 on socket 1 00:05:33.823 EAL: Detected lcore 91 as core 8 on socket 1 00:05:33.823 EAL: Detected lcore 92 as core 9 on socket 1 00:05:33.823 EAL: Detected lcore 93 as core 10 on socket 1 00:05:33.823 EAL: Detected lcore 94 as core 11 on socket 1 00:05:33.823 EAL: Detected lcore 95 as core 12 on socket 1 00:05:33.823 EAL: Detected lcore 96 as core 13 on socket 1 00:05:33.823 EAL: Detected lcore 97 as core 14 on socket 1 00:05:33.823 EAL: Detected lcore 98 as core 16 on socket 1 00:05:33.823 EAL: Detected lcore 99 as core 17 on socket 1 00:05:33.823 EAL: Detected lcore 100 as core 18 on socket 1 00:05:33.823 EAL: Detected lcore 101 as core 19 on socket 1 00:05:33.823 EAL: Detected lcore 102 as core 20 on socket 1 00:05:33.823 EAL: Detected lcore 103 as core 21 on socket 1 00:05:33.823 EAL: Detected lcore 104 as core 22 on socket 1 00:05:33.823 EAL: Detected lcore 105 as core 24 on socket 1 00:05:33.823 EAL: Detected lcore 106 as core 25 on socket 1 00:05:33.823 EAL: Detected lcore 107 as core 26 on socket 1 00:05:33.823 EAL: Detected lcore 108 as core 27 on socket 1 00:05:33.823 EAL: Detected lcore 109 as core 28 on socket 1 00:05:33.824 EAL: Detected lcore 110 as core 29 on socket 1 00:05:33.824 EAL: Detected lcore 111 as core 30 on socket 1 00:05:33.824 EAL: Maximum logical cores by configuration: 128 00:05:33.824 EAL: Detected CPU lcores: 112 00:05:33.824 EAL: Detected NUMA nodes: 2 00:05:33.824 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:05:33.824 EAL: Checking presence of .so 'librte_eal.so.23' 00:05:33.824 EAL: Checking presence of .so 'librte_eal.so' 00:05:33.824 EAL: Detected static linkage of DPDK 00:05:33.824 EAL: No shared files mode enabled, IPC will be disabled 00:05:33.824 EAL: Bus pci wants IOVA as 'DC' 00:05:33.824 EAL: Buses did not request a specific IOVA mode. 00:05:33.824 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:33.824 EAL: Selected IOVA mode 'VA' 00:05:33.824 EAL: No free 2048 kB hugepages reported on node 1 00:05:33.824 EAL: Probing VFIO support... 
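Before the memory setup that follows, EAL probes which VFIO IOMMU backends the kernel offers; here only Type 1 is supported, which is what lets it select IOVA mode 'VA'. A rough shell-level equivalent of that probe, using standard Linux sysfs and module paths rather than anything SPDK-specific:

  # IOMMU groups appear in sysfs only when the platform IOMMU is enabled,
  # the precondition for vfio-pci and IOVA-as-VA
  if [ -n "$(ls -A /sys/kernel/iommu_groups 2>/dev/null)" ]; then
    echo "IOMMU enabled: IOVA mode VA is usable"
  else
    echo "no IOMMU groups: EAL would need no-IOMMU mode or physical addressing"
  fi
  modinfo vfio_iommu_type1 >/dev/null 2>&1 && echo "Type 1 backend available"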
00:05:33.824 EAL: IOMMU type 1 (Type 1) is supported 00:05:33.824 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:33.824 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:33.824 EAL: VFIO support initialized 00:05:33.824 EAL: Ask a virtual area of 0x2e000 bytes 00:05:33.824 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:33.824 EAL: Setting up physically contiguous memory... 00:05:33.824 EAL: Setting maximum number of open files to 524288 00:05:33.824 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:33.824 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:33.824 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:33.824 EAL: Ask a virtual area of 0x61000 bytes 00:05:33.824 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:33.824 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:33.824 EAL: Ask a virtual area of 0x400000000 bytes 00:05:33.824 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:33.824 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:33.824 EAL: Ask a virtual area of 0x61000 bytes 00:05:33.824 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:33.824 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:33.824 EAL: Ask a virtual area of 0x400000000 bytes 00:05:33.824 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:33.824 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:33.824 EAL: Ask a virtual area of 0x61000 bytes 00:05:33.824 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:33.824 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:33.824 EAL: Ask a virtual area of 0x400000000 bytes 00:05:33.824 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:33.824 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:33.824 EAL: Ask a virtual area of 0x61000 bytes 00:05:33.824 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:33.824 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:33.824 EAL: Ask a virtual area of 0x400000000 bytes 00:05:33.824 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:33.824 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:33.824 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:33.824 EAL: Ask a virtual area of 0x61000 bytes 00:05:33.824 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:33.824 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:33.824 EAL: Ask a virtual area of 0x400000000 bytes 00:05:33.824 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:33.824 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:33.824 EAL: Ask a virtual area of 0x61000 bytes 00:05:33.824 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:33.824 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:33.824 EAL: Ask a virtual area of 0x400000000 bytes 00:05:33.824 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:33.824 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:33.824 EAL: Ask a virtual area of 0x61000 bytes 00:05:33.824 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:33.824 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:33.824 EAL: Ask a virtual area of 0x400000000 bytes 00:05:33.824 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:33.824 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:33.824 EAL: Ask a virtual area of 0x61000 bytes 00:05:33.824 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:33.824 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:33.824 EAL: Ask a virtual area of 0x400000000 bytes 00:05:33.824 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:33.824 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:33.824 EAL: Hugepages will be freed exactly as allocated. 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: TSC frequency is ~2500000 KHz 00:05:33.824 EAL: Main lcore 0 is ready (tid=7fdc2a81ca00;cpuset=[0]) 00:05:33.824 EAL: Trying to obtain current memory policy. 00:05:33.824 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:33.824 EAL: Restoring previous memory policy: 0 00:05:33.824 EAL: request: mp_malloc_sync 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Heap on socket 0 was expanded by 2MB 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Mem event callback 'spdk:(nil)' registered 00:05:33.824 00:05:33.824 00:05:33.824 CUnit - A unit testing framework for C - Version 2.1-3 00:05:33.824 http://cunit.sourceforge.net/ 00:05:33.824 00:05:33.824 00:05:33.824 Suite: components_suite 00:05:33.824 Test: vtophys_malloc_test ...passed 00:05:33.824 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:33.824 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:33.824 EAL: Restoring previous memory policy: 4 00:05:33.824 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.824 EAL: request: mp_malloc_sync 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Heap on socket 0 was expanded by 4MB 00:05:33.824 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.824 EAL: request: mp_malloc_sync 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Heap on socket 0 was shrunk by 4MB 00:05:33.824 EAL: Trying to obtain current memory policy. 00:05:33.824 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:33.824 EAL: Restoring previous memory policy: 4 00:05:33.824 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.824 EAL: request: mp_malloc_sync 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Heap on socket 0 was expanded by 6MB 00:05:33.824 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.824 EAL: request: mp_malloc_sync 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Heap on socket 0 was shrunk by 6MB 00:05:33.824 EAL: Trying to obtain current memory policy. 00:05:33.824 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:33.824 EAL: Restoring previous memory policy: 4 00:05:33.824 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.824 EAL: request: mp_malloc_sync 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Heap on socket 0 was expanded by 10MB 00:05:33.824 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.824 EAL: request: mp_malloc_sync 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Heap on socket 0 was shrunk by 10MB 00:05:33.824 EAL: Trying to obtain current memory policy. 
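The reservation sizes above are internally consistent and easy to verify from the numbers in the log alone: each memseg list holds n_segs:8192 pages of hugepage_sz:2097152 bytes, which is exactly the 0x400000000-byte (16 GiB) virtual area requested per list, and 4 lists per socket across 2 sockets reserve 128 GiB of address space in total:

  printf '0x%x\n' $((8192 * 2097152))   # 0x400000000 = one memseg list's VA window
  echo "$((4 * 2 * 16)) GiB reserved"   # 4 lists/socket x 2 sockets x 16 GiB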
00:05:33.824 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:33.824 EAL: Restoring previous memory policy: 4 00:05:33.824 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.824 EAL: request: mp_malloc_sync 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Heap on socket 0 was expanded by 18MB 00:05:33.824 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.824 EAL: request: mp_malloc_sync 00:05:33.824 EAL: No shared files mode enabled, IPC is disabled 00:05:33.824 EAL: Heap on socket 0 was shrunk by 18MB 00:05:33.824 EAL: Trying to obtain current memory policy. 00:05:33.825 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:33.825 EAL: Restoring previous memory policy: 4 00:05:33.825 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.825 EAL: request: mp_malloc_sync 00:05:33.825 EAL: No shared files mode enabled, IPC is disabled 00:05:33.825 EAL: Heap on socket 0 was expanded by 34MB 00:05:33.825 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.825 EAL: request: mp_malloc_sync 00:05:33.825 EAL: No shared files mode enabled, IPC is disabled 00:05:33.825 EAL: Heap on socket 0 was shrunk by 34MB 00:05:33.825 EAL: Trying to obtain current memory policy. 00:05:33.825 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:33.825 EAL: Restoring previous memory policy: 4 00:05:33.825 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.825 EAL: request: mp_malloc_sync 00:05:33.825 EAL: No shared files mode enabled, IPC is disabled 00:05:33.825 EAL: Heap on socket 0 was expanded by 66MB 00:05:33.825 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.825 EAL: request: mp_malloc_sync 00:05:33.825 EAL: No shared files mode enabled, IPC is disabled 00:05:33.825 EAL: Heap on socket 0 was shrunk by 66MB 00:05:33.825 EAL: Trying to obtain current memory policy. 00:05:33.825 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:33.825 EAL: Restoring previous memory policy: 4 00:05:33.825 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.825 EAL: request: mp_malloc_sync 00:05:33.825 EAL: No shared files mode enabled, IPC is disabled 00:05:33.825 EAL: Heap on socket 0 was expanded by 130MB 00:05:33.825 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.825 EAL: request: mp_malloc_sync 00:05:33.825 EAL: No shared files mode enabled, IPC is disabled 00:05:33.825 EAL: Heap on socket 0 was shrunk by 130MB 00:05:33.825 EAL: Trying to obtain current memory policy. 00:05:33.825 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:34.083 EAL: Restoring previous memory policy: 4 00:05:34.083 EAL: Calling mem event callback 'spdk:(nil)' 00:05:34.083 EAL: request: mp_malloc_sync 00:05:34.083 EAL: No shared files mode enabled, IPC is disabled 00:05:34.083 EAL: Heap on socket 0 was expanded by 258MB 00:05:34.083 EAL: Calling mem event callback 'spdk:(nil)' 00:05:34.083 EAL: request: mp_malloc_sync 00:05:34.083 EAL: No shared files mode enabled, IPC is disabled 00:05:34.083 EAL: Heap on socket 0 was shrunk by 258MB 00:05:34.083 EAL: Trying to obtain current memory policy. 
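The expand/shrink ladder this suite climbs is not arbitrary: each step observed in the log requests 2^k + 2 MB, reproducing the 4, 6, 10, 18, 34, 66, 130, 258 (and, below, 514 and 1026) MB sizes:

  for k in $(seq 1 10); do
    printf '%dMB ' $(( (1 << k) + 2 ))   # 4MB 6MB 10MB 18MB ... 514MB 1026MB
  done; echo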
00:05:34.083 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:34.083 EAL: Restoring previous memory policy: 4 00:05:34.083 EAL: Calling mem event callback 'spdk:(nil)' 00:05:34.083 EAL: request: mp_malloc_sync 00:05:34.083 EAL: No shared files mode enabled, IPC is disabled 00:05:34.083 EAL: Heap on socket 0 was expanded by 514MB 00:05:34.341 EAL: Calling mem event callback 'spdk:(nil)' 00:05:34.341 EAL: request: mp_malloc_sync 00:05:34.341 EAL: No shared files mode enabled, IPC is disabled 00:05:34.341 EAL: Heap on socket 0 was shrunk by 514MB 00:05:34.341 EAL: Trying to obtain current memory policy. 00:05:34.341 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:34.601 EAL: Restoring previous memory policy: 4 00:05:34.601 EAL: Calling mem event callback 'spdk:(nil)' 00:05:34.601 EAL: request: mp_malloc_sync 00:05:34.601 EAL: No shared files mode enabled, IPC is disabled 00:05:34.601 EAL: Heap on socket 0 was expanded by 1026MB 00:05:34.601 EAL: Calling mem event callback 'spdk:(nil)' 00:05:34.859 EAL: request: mp_malloc_sync 00:05:34.859 EAL: No shared files mode enabled, IPC is disabled 00:05:34.859 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:34.859 passed 00:05:34.859 00:05:34.859 Run Summary: Type Total Ran Passed Failed Inactive 00:05:34.860 suites 1 1 n/a 0 0 00:05:34.860 tests 2 2 2 0 0 00:05:34.860 asserts 497 497 497 0 n/a 00:05:34.860 00:05:34.860 Elapsed time = 0.964 seconds 00:05:34.860 EAL: Calling mem event callback 'spdk:(nil)' 00:05:34.860 EAL: request: mp_malloc_sync 00:05:34.860 EAL: No shared files mode enabled, IPC is disabled 00:05:34.860 EAL: Heap on socket 0 was shrunk by 2MB 00:05:34.860 EAL: No shared files mode enabled, IPC is disabled 00:05:34.860 EAL: No shared files mode enabled, IPC is disabled 00:05:34.860 EAL: No shared files mode enabled, IPC is disabled 00:05:34.860 00:05:34.860 real 0m1.088s 00:05:34.860 user 0m0.625s 00:05:34.860 sys 0m0.429s 00:05:34.860 05:30:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.860 05:30:45 -- common/autotest_common.sh@10 -- # set +x 00:05:34.860 ************************************ 00:05:34.860 END TEST env_vtophys 00:05:34.860 ************************************ 00:05:34.860 05:30:46 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:34.860 05:30:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:34.860 05:30:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.860 05:30:46 -- common/autotest_common.sh@10 -- # set +x 00:05:34.860 ************************************ 00:05:34.860 START TEST env_pci 00:05:34.860 ************************************ 00:05:34.860 05:30:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:34.860 00:05:34.860 00:05:34.860 CUnit - A unit testing framework for C - Version 2.1-3 00:05:34.860 http://cunit.sourceforge.net/ 00:05:34.860 00:05:34.860 00:05:34.860 Suite: pci 00:05:34.860 Test: pci_hook ...[2024-11-29 05:30:46.062912] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2189444 has claimed it 00:05:34.860 EAL: Cannot find device (10000:00:01.0) 00:05:34.860 EAL: Failed to attach device on primary process 00:05:34.860 passed 00:05:34.860 00:05:34.860 Run Summary: Type Total Ran Passed Failed Inactive 00:05:34.860 suites 1 1 n/a 0 0 00:05:34.860 tests 1 1 1 0 0 
00:05:34.860 asserts 25 25 25 0 n/a 00:05:34.860 00:05:34.860 Elapsed time = 0.034 seconds 00:05:34.860 00:05:34.860 real 0m0.052s 00:05:34.860 user 0m0.014s 00:05:34.860 sys 0m0.037s 00:05:34.860 05:30:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.860 05:30:46 -- common/autotest_common.sh@10 -- # set +x 00:05:34.860 ************************************ 00:05:34.860 END TEST env_pci 00:05:34.860 ************************************ 00:05:34.860 05:30:46 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:34.860 05:30:46 -- env/env.sh@15 -- # uname 00:05:34.860 05:30:46 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:34.860 05:30:46 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:34.860 05:30:46 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:34.860 05:30:46 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:34.860 05:30:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.860 05:30:46 -- common/autotest_common.sh@10 -- # set +x 00:05:34.860 ************************************ 00:05:34.860 START TEST env_dpdk_post_init 00:05:34.860 ************************************ 00:05:34.860 05:30:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:35.119 EAL: Detected CPU lcores: 112 00:05:35.119 EAL: Detected NUMA nodes: 2 00:05:35.119 EAL: Detected static linkage of DPDK 00:05:35.119 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:35.119 EAL: Selected IOVA mode 'VA' 00:05:35.119 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.119 EAL: VFIO support initialized 00:05:35.119 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:35.119 EAL: Using IOMMU type 1 (Type 1) 00:05:36.056 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:39.342 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:39.342 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:39.601 Starting DPDK initialization... 00:05:39.601 Starting SPDK post initialization... 00:05:39.601 SPDK NVMe probe 00:05:39.601 Attaching to 0000:d8:00.0 00:05:39.601 Attached to 0000:d8:00.0 00:05:39.601 Cleaning up... 
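env_dpdk_post_init above could only attach to 0000:d8:00.0 because the earlier setup.sh pass had bound it to vfio-pci. The bind/status/reset cycle this job keeps repeating reduces to three invocations (workspace path as in this run; comments summarize the behavior seen in this log):

  spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  sudo "$spdk/scripts/setup.sh"          # nvme/ioatdma -> vfio-pci for DPDK/SPDK tests
  sudo "$spdk/scripts/setup.sh" status   # print the Hugepages and BDF tables seen earlier
  sudo "$spdk/scripts/setup.sh" reset    # vfio-pci -> kernel drivers (nvme, ioatdma)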
00:05:39.601 00:05:39.601 real 0m4.716s 00:05:39.601 user 0m3.557s 00:05:39.601 sys 0m0.403s 00:05:39.601 05:30:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.601 05:30:50 -- common/autotest_common.sh@10 -- # set +x 00:05:39.601 ************************************ 00:05:39.601 END TEST env_dpdk_post_init 00:05:39.601 ************************************ 00:05:39.859 05:30:50 -- env/env.sh@26 -- # uname 00:05:39.859 05:30:50 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:39.859 05:30:50 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:39.859 05:30:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.859 05:30:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.859 05:30:50 -- common/autotest_common.sh@10 -- # set +x 00:05:39.859 ************************************ 00:05:39.859 START TEST env_mem_callbacks 00:05:39.859 ************************************ 00:05:39.859 05:30:50 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:39.859 EAL: Detected CPU lcores: 112 00:05:39.859 EAL: Detected NUMA nodes: 2 00:05:39.859 EAL: Detected static linkage of DPDK 00:05:39.859 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:39.859 EAL: Selected IOVA mode 'VA' 00:05:39.859 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.859 EAL: VFIO support initialized 00:05:39.859 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:39.859 00:05:39.859 00:05:39.859 CUnit - A unit testing framework for C - Version 2.1-3 00:05:39.859 http://cunit.sourceforge.net/ 00:05:39.859 00:05:39.859 00:05:39.859 Suite: memory 00:05:39.859 Test: test ... 
00:05:39.859 register 0x200000200000 2097152 00:05:39.859 malloc 3145728 00:05:39.859 register 0x200000400000 4194304 00:05:39.859 buf 0x200000500000 len 3145728 PASSED 00:05:39.859 malloc 64 00:05:39.859 buf 0x2000004fff40 len 64 PASSED 00:05:39.859 malloc 4194304 00:05:39.859 register 0x200000800000 6291456 00:05:39.859 buf 0x200000a00000 len 4194304 PASSED 00:05:39.859 free 0x200000500000 3145728 00:05:39.859 free 0x2000004fff40 64 00:05:39.859 unregister 0x200000400000 4194304 PASSED 00:05:39.859 free 0x200000a00000 4194304 00:05:39.859 unregister 0x200000800000 6291456 PASSED 00:05:39.859 malloc 8388608 00:05:39.859 register 0x200000400000 10485760 00:05:39.859 buf 0x200000600000 len 8388608 PASSED 00:05:39.859 free 0x200000600000 8388608 00:05:39.859 unregister 0x200000400000 10485760 PASSED 00:05:39.859 passed 00:05:39.859 00:05:39.859 Run Summary: Type Total Ran Passed Failed Inactive 00:05:39.859 suites 1 1 n/a 0 0 00:05:39.859 tests 1 1 1 0 0 00:05:39.859 asserts 15 15 15 0 n/a 00:05:39.859 00:05:39.859 Elapsed time = 0.005 seconds 00:05:39.859 00:05:39.859 real 0m0.062s 00:05:39.859 user 0m0.013s 00:05:39.859 sys 0m0.048s 00:05:39.859 05:30:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.859 05:30:50 -- common/autotest_common.sh@10 -- # set +x 00:05:39.859 ************************************ 00:05:39.859 END TEST env_mem_callbacks 00:05:39.859 ************************************ 00:05:39.859 00:05:39.859 real 0m6.457s 00:05:39.860 user 0m4.476s 00:05:39.860 sys 0m1.249s 00:05:39.860 05:30:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.860 05:30:51 -- common/autotest_common.sh@10 -- # set +x 00:05:39.860 ************************************ 00:05:39.860 END TEST env 00:05:39.860 ************************************ 00:05:39.860 05:30:51 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:39.860 05:30:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.860 05:30:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.860 05:30:51 -- common/autotest_common.sh@10 -- # set +x 00:05:39.860 ************************************ 00:05:39.860 START TEST rpc 00:05:39.860 ************************************ 00:05:39.860 05:30:51 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:40.151 * Looking for test storage... 
00:05:40.151 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:40.151 05:30:51 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:40.151 05:30:51 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:40.151 05:30:51 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:40.151 05:30:51 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:40.151 05:30:51 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:40.151 05:30:51 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:40.151 05:30:51 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:40.151 05:30:51 -- scripts/common.sh@335 -- # IFS=.-: 00:05:40.151 05:30:51 -- scripts/common.sh@335 -- # read -ra ver1 00:05:40.151 05:30:51 -- scripts/common.sh@336 -- # IFS=.-: 00:05:40.151 05:30:51 -- scripts/common.sh@336 -- # read -ra ver2 00:05:40.151 05:30:51 -- scripts/common.sh@337 -- # local 'op=<' 00:05:40.151 05:30:51 -- scripts/common.sh@339 -- # ver1_l=2 00:05:40.151 05:30:51 -- scripts/common.sh@340 -- # ver2_l=1 00:05:40.151 05:30:51 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:40.151 05:30:51 -- scripts/common.sh@343 -- # case "$op" in 00:05:40.151 05:30:51 -- scripts/common.sh@344 -- # : 1 00:05:40.151 05:30:51 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:40.151 05:30:51 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:40.151 05:30:51 -- scripts/common.sh@364 -- # decimal 1 00:05:40.151 05:30:51 -- scripts/common.sh@352 -- # local d=1 00:05:40.151 05:30:51 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:40.151 05:30:51 -- scripts/common.sh@354 -- # echo 1 00:05:40.151 05:30:51 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:40.151 05:30:51 -- scripts/common.sh@365 -- # decimal 2 00:05:40.151 05:30:51 -- scripts/common.sh@352 -- # local d=2 00:05:40.151 05:30:51 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:40.151 05:30:51 -- scripts/common.sh@354 -- # echo 2 00:05:40.151 05:30:51 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:40.151 05:30:51 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:40.151 05:30:51 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:40.151 05:30:51 -- scripts/common.sh@367 -- # return 0 00:05:40.151 05:30:51 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:40.151 05:30:51 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:40.151 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.151 --rc genhtml_branch_coverage=1 00:05:40.151 --rc genhtml_function_coverage=1 00:05:40.151 --rc genhtml_legend=1 00:05:40.151 --rc geninfo_all_blocks=1 00:05:40.151 --rc geninfo_unexecuted_blocks=1 00:05:40.151 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.151 ' 00:05:40.151 05:30:51 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:40.151 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.151 --rc genhtml_branch_coverage=1 00:05:40.151 --rc genhtml_function_coverage=1 00:05:40.151 --rc genhtml_legend=1 00:05:40.151 --rc geninfo_all_blocks=1 00:05:40.151 --rc geninfo_unexecuted_blocks=1 00:05:40.151 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.151 ' 00:05:40.151 05:30:51 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:40.151 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.151 --rc genhtml_branch_coverage=1 00:05:40.151 
--rc genhtml_function_coverage=1 00:05:40.151 --rc genhtml_legend=1 00:05:40.151 --rc geninfo_all_blocks=1 00:05:40.151 --rc geninfo_unexecuted_blocks=1 00:05:40.151 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.151 ' 00:05:40.151 05:30:51 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:40.151 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.151 --rc genhtml_branch_coverage=1 00:05:40.151 --rc genhtml_function_coverage=1 00:05:40.151 --rc genhtml_legend=1 00:05:40.151 --rc geninfo_all_blocks=1 00:05:40.151 --rc geninfo_unexecuted_blocks=1 00:05:40.151 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.151 ' 00:05:40.151 05:30:51 -- rpc/rpc.sh@65 -- # spdk_pid=2190427 00:05:40.151 05:30:51 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:40.151 05:30:51 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:40.151 05:30:51 -- rpc/rpc.sh@67 -- # waitforlisten 2190427 00:05:40.151 05:30:51 -- common/autotest_common.sh@829 -- # '[' -z 2190427 ']' 00:05:40.151 05:30:51 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:40.151 05:30:51 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:40.151 05:30:51 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:40.151 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:40.151 05:30:51 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:40.151 05:30:51 -- common/autotest_common.sh@10 -- # set +x 00:05:40.151 [2024-11-29 05:30:51.283333] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:40.151 [2024-11-29 05:30:51.283400] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2190427 ] 00:05:40.151 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.151 [2024-11-29 05:30:51.348453] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.151 [2024-11-29 05:30:51.384394] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:40.151 [2024-11-29 05:30:51.384523] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:40.151 [2024-11-29 05:30:51.384534] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2190427' to capture a snapshot of events at runtime. 00:05:40.151 [2024-11-29 05:30:51.384543] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2190427 for offline analysis/debug. 
00:05:40.151 [2024-11-29 05:30:51.384567] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.139 05:30:52 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:41.139 05:30:52 -- common/autotest_common.sh@862 -- # return 0 00:05:41.139 05:30:52 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:41.139 05:30:52 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:41.139 05:30:52 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:41.139 05:30:52 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:41.139 05:30:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:41.139 05:30:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:41.139 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.139 ************************************ 00:05:41.139 START TEST rpc_integrity 00:05:41.139 ************************************ 00:05:41.139 05:30:52 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:41.139 05:30:52 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:41.139 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.139 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.139 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.139 05:30:52 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:41.139 05:30:52 -- rpc/rpc.sh@13 -- # jq length 00:05:41.139 05:30:52 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:41.139 05:30:52 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:41.139 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.139 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.139 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.139 05:30:52 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:41.139 05:30:52 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:41.139 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.139 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.139 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.139 05:30:52 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:41.139 { 00:05:41.139 "name": "Malloc0", 00:05:41.139 "aliases": [ 00:05:41.139 "a028d7e9-c992-456c-92dd-5851a27ce7b5" 00:05:41.139 ], 00:05:41.139 "product_name": "Malloc disk", 00:05:41.139 "block_size": 512, 00:05:41.139 "num_blocks": 16384, 00:05:41.139 "uuid": "a028d7e9-c992-456c-92dd-5851a27ce7b5", 00:05:41.139 "assigned_rate_limits": { 00:05:41.139 "rw_ios_per_sec": 0, 00:05:41.139 "rw_mbytes_per_sec": 0, 00:05:41.139 "r_mbytes_per_sec": 0, 00:05:41.139 "w_mbytes_per_sec": 0 00:05:41.139 }, 00:05:41.139 "claimed": false, 00:05:41.139 "zoned": false, 00:05:41.139 "supported_io_types": { 00:05:41.139 "read": true, 00:05:41.139 "write": true, 00:05:41.139 "unmap": true, 00:05:41.139 "write_zeroes": true, 00:05:41.139 "flush": true, 00:05:41.139 "reset": true, 00:05:41.139 "compare": false, 00:05:41.139 "compare_and_write": false, 
00:05:41.139 "abort": true, 00:05:41.139 "nvme_admin": false, 00:05:41.139 "nvme_io": false 00:05:41.139 }, 00:05:41.139 "memory_domains": [ 00:05:41.139 { 00:05:41.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:41.139 "dma_device_type": 2 00:05:41.139 } 00:05:41.139 ], 00:05:41.139 "driver_specific": {} 00:05:41.139 } 00:05:41.139 ]' 00:05:41.139 05:30:52 -- rpc/rpc.sh@17 -- # jq length 00:05:41.139 05:30:52 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:41.139 05:30:52 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:41.139 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.139 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.139 [2024-11-29 05:30:52.238000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:41.139 [2024-11-29 05:30:52.238035] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:41.139 [2024-11-29 05:30:52.238056] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4989850 00:05:41.139 [2024-11-29 05:30:52.238067] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:41.139 [2024-11-29 05:30:52.238888] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:41.139 [2024-11-29 05:30:52.238912] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:41.139 Passthru0 00:05:41.139 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.139 05:30:52 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:41.139 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.139 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.139 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.139 05:30:52 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:41.139 { 00:05:41.139 "name": "Malloc0", 00:05:41.139 "aliases": [ 00:05:41.139 "a028d7e9-c992-456c-92dd-5851a27ce7b5" 00:05:41.139 ], 00:05:41.139 "product_name": "Malloc disk", 00:05:41.139 "block_size": 512, 00:05:41.139 "num_blocks": 16384, 00:05:41.139 "uuid": "a028d7e9-c992-456c-92dd-5851a27ce7b5", 00:05:41.139 "assigned_rate_limits": { 00:05:41.139 "rw_ios_per_sec": 0, 00:05:41.139 "rw_mbytes_per_sec": 0, 00:05:41.139 "r_mbytes_per_sec": 0, 00:05:41.139 "w_mbytes_per_sec": 0 00:05:41.139 }, 00:05:41.139 "claimed": true, 00:05:41.139 "claim_type": "exclusive_write", 00:05:41.139 "zoned": false, 00:05:41.139 "supported_io_types": { 00:05:41.139 "read": true, 00:05:41.139 "write": true, 00:05:41.139 "unmap": true, 00:05:41.139 "write_zeroes": true, 00:05:41.139 "flush": true, 00:05:41.139 "reset": true, 00:05:41.139 "compare": false, 00:05:41.139 "compare_and_write": false, 00:05:41.139 "abort": true, 00:05:41.139 "nvme_admin": false, 00:05:41.139 "nvme_io": false 00:05:41.139 }, 00:05:41.139 "memory_domains": [ 00:05:41.139 { 00:05:41.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:41.139 "dma_device_type": 2 00:05:41.139 } 00:05:41.139 ], 00:05:41.139 "driver_specific": {} 00:05:41.139 }, 00:05:41.140 { 00:05:41.140 "name": "Passthru0", 00:05:41.140 "aliases": [ 00:05:41.140 "3aeb27ca-b8d4-5185-8857-dad9abb909d6" 00:05:41.140 ], 00:05:41.140 "product_name": "passthru", 00:05:41.140 "block_size": 512, 00:05:41.140 "num_blocks": 16384, 00:05:41.140 "uuid": "3aeb27ca-b8d4-5185-8857-dad9abb909d6", 00:05:41.140 "assigned_rate_limits": { 00:05:41.140 "rw_ios_per_sec": 0, 00:05:41.140 "rw_mbytes_per_sec": 0, 00:05:41.140 "r_mbytes_per_sec": 0, 00:05:41.140 
"w_mbytes_per_sec": 0 00:05:41.140 }, 00:05:41.140 "claimed": false, 00:05:41.140 "zoned": false, 00:05:41.140 "supported_io_types": { 00:05:41.140 "read": true, 00:05:41.140 "write": true, 00:05:41.140 "unmap": true, 00:05:41.140 "write_zeroes": true, 00:05:41.140 "flush": true, 00:05:41.140 "reset": true, 00:05:41.140 "compare": false, 00:05:41.140 "compare_and_write": false, 00:05:41.140 "abort": true, 00:05:41.140 "nvme_admin": false, 00:05:41.140 "nvme_io": false 00:05:41.140 }, 00:05:41.140 "memory_domains": [ 00:05:41.140 { 00:05:41.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:41.140 "dma_device_type": 2 00:05:41.140 } 00:05:41.140 ], 00:05:41.140 "driver_specific": { 00:05:41.140 "passthru": { 00:05:41.140 "name": "Passthru0", 00:05:41.140 "base_bdev_name": "Malloc0" 00:05:41.140 } 00:05:41.140 } 00:05:41.140 } 00:05:41.140 ]' 00:05:41.140 05:30:52 -- rpc/rpc.sh@21 -- # jq length 00:05:41.140 05:30:52 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:41.140 05:30:52 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:41.140 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.140 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.140 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.140 05:30:52 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:41.140 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.140 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.140 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.140 05:30:52 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:41.140 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.140 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.140 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.140 05:30:52 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:41.140 05:30:52 -- rpc/rpc.sh@26 -- # jq length 00:05:41.140 05:30:52 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:41.140 00:05:41.140 real 0m0.235s 00:05:41.140 user 0m0.125s 00:05:41.140 sys 0m0.045s 00:05:41.140 05:30:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:41.140 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.140 ************************************ 00:05:41.140 END TEST rpc_integrity 00:05:41.140 ************************************ 00:05:41.140 05:30:52 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:41.140 05:30:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:41.140 05:30:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:41.140 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.140 ************************************ 00:05:41.140 START TEST rpc_plugins 00:05:41.140 ************************************ 00:05:41.140 05:30:52 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:05:41.140 05:30:52 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:41.140 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.140 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.140 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.140 05:30:52 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:41.140 05:30:52 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:41.140 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.140 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.140 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.140 05:30:52 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:05:41.140 { 00:05:41.140 "name": "Malloc1", 00:05:41.140 "aliases": [ 00:05:41.140 "8a161764-f255-4170-a768-20e13f8b4dd8" 00:05:41.140 ], 00:05:41.140 "product_name": "Malloc disk", 00:05:41.140 "block_size": 4096, 00:05:41.140 "num_blocks": 256, 00:05:41.140 "uuid": "8a161764-f255-4170-a768-20e13f8b4dd8", 00:05:41.140 "assigned_rate_limits": { 00:05:41.140 "rw_ios_per_sec": 0, 00:05:41.140 "rw_mbytes_per_sec": 0, 00:05:41.140 "r_mbytes_per_sec": 0, 00:05:41.140 "w_mbytes_per_sec": 0 00:05:41.140 }, 00:05:41.140 "claimed": false, 00:05:41.140 "zoned": false, 00:05:41.140 "supported_io_types": { 00:05:41.140 "read": true, 00:05:41.140 "write": true, 00:05:41.140 "unmap": true, 00:05:41.140 "write_zeroes": true, 00:05:41.140 "flush": true, 00:05:41.140 "reset": true, 00:05:41.140 "compare": false, 00:05:41.140 "compare_and_write": false, 00:05:41.140 "abort": true, 00:05:41.140 "nvme_admin": false, 00:05:41.140 "nvme_io": false 00:05:41.140 }, 00:05:41.140 "memory_domains": [ 00:05:41.140 { 00:05:41.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:41.140 "dma_device_type": 2 00:05:41.140 } 00:05:41.140 ], 00:05:41.140 "driver_specific": {} 00:05:41.140 } 00:05:41.140 ]' 00:05:41.140 05:30:52 -- rpc/rpc.sh@32 -- # jq length 00:05:41.399 05:30:52 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:41.399 05:30:52 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:41.399 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.399 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.399 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.399 05:30:52 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:41.399 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.399 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.399 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.399 05:30:52 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:41.399 05:30:52 -- rpc/rpc.sh@36 -- # jq length 00:05:41.399 05:30:52 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:41.399 00:05:41.399 real 0m0.127s 00:05:41.399 user 0m0.077s 00:05:41.399 sys 0m0.019s 00:05:41.399 05:30:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:41.399 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.399 ************************************ 00:05:41.399 END TEST rpc_plugins 00:05:41.399 ************************************ 00:05:41.399 05:30:52 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:41.399 05:30:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:41.399 05:30:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:41.399 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.399 ************************************ 00:05:41.399 START TEST rpc_trace_cmd_test 00:05:41.399 ************************************ 00:05:41.399 05:30:52 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:05:41.399 05:30:52 -- rpc/rpc.sh@40 -- # local info 00:05:41.399 05:30:52 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:41.399 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.399 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.399 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.399 05:30:52 -- rpc/rpc.sh@42 -- # info='{ 00:05:41.399 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2190427", 00:05:41.399 "tpoint_group_mask": "0x8", 00:05:41.399 "iscsi_conn": { 00:05:41.399 "mask": "0x2", 
00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "scsi": { 00:05:41.399 "mask": "0x4", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "bdev": { 00:05:41.399 "mask": "0x8", 00:05:41.399 "tpoint_mask": "0xffffffffffffffff" 00:05:41.399 }, 00:05:41.399 "nvmf_rdma": { 00:05:41.399 "mask": "0x10", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "nvmf_tcp": { 00:05:41.399 "mask": "0x20", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "ftl": { 00:05:41.399 "mask": "0x40", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "blobfs": { 00:05:41.399 "mask": "0x80", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "dsa": { 00:05:41.399 "mask": "0x200", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "thread": { 00:05:41.399 "mask": "0x400", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "nvme_pcie": { 00:05:41.399 "mask": "0x800", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "iaa": { 00:05:41.399 "mask": "0x1000", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "nvme_tcp": { 00:05:41.399 "mask": "0x2000", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 }, 00:05:41.399 "bdev_nvme": { 00:05:41.399 "mask": "0x4000", 00:05:41.399 "tpoint_mask": "0x0" 00:05:41.399 } 00:05:41.399 }' 00:05:41.399 05:30:52 -- rpc/rpc.sh@43 -- # jq length 00:05:41.399 05:30:52 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:41.399 05:30:52 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:41.399 05:30:52 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:41.399 05:30:52 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:41.658 05:30:52 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:41.658 05:30:52 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:41.658 05:30:52 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:41.658 05:30:52 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:41.658 05:30:52 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:41.658 00:05:41.658 real 0m0.205s 00:05:41.658 user 0m0.155s 00:05:41.658 sys 0m0.043s 00:05:41.658 05:30:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:41.658 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.658 ************************************ 00:05:41.658 END TEST rpc_trace_cmd_test 00:05:41.658 ************************************ 00:05:41.658 05:30:52 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:41.658 05:30:52 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:41.658 05:30:52 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:41.658 05:30:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:41.658 05:30:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:41.658 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.658 ************************************ 00:05:41.658 START TEST rpc_daemon_integrity 00:05:41.658 ************************************ 00:05:41.658 05:30:52 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:41.658 05:30:52 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:41.658 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.658 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.658 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.658 05:30:52 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:41.658 05:30:52 -- rpc/rpc.sh@13 -- # jq length 00:05:41.658 05:30:52 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:41.658 05:30:52 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:41.658 
05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.658 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.658 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.658 05:30:52 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:41.658 05:30:52 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:41.658 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.658 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.658 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.658 05:30:52 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:41.658 { 00:05:41.658 "name": "Malloc2", 00:05:41.658 "aliases": [ 00:05:41.658 "60c10f44-c31c-47a0-ae9e-48555c4746d9" 00:05:41.658 ], 00:05:41.658 "product_name": "Malloc disk", 00:05:41.658 "block_size": 512, 00:05:41.658 "num_blocks": 16384, 00:05:41.658 "uuid": "60c10f44-c31c-47a0-ae9e-48555c4746d9", 00:05:41.658 "assigned_rate_limits": { 00:05:41.658 "rw_ios_per_sec": 0, 00:05:41.658 "rw_mbytes_per_sec": 0, 00:05:41.658 "r_mbytes_per_sec": 0, 00:05:41.658 "w_mbytes_per_sec": 0 00:05:41.658 }, 00:05:41.658 "claimed": false, 00:05:41.658 "zoned": false, 00:05:41.658 "supported_io_types": { 00:05:41.658 "read": true, 00:05:41.658 "write": true, 00:05:41.658 "unmap": true, 00:05:41.658 "write_zeroes": true, 00:05:41.658 "flush": true, 00:05:41.658 "reset": true, 00:05:41.658 "compare": false, 00:05:41.658 "compare_and_write": false, 00:05:41.658 "abort": true, 00:05:41.658 "nvme_admin": false, 00:05:41.658 "nvme_io": false 00:05:41.658 }, 00:05:41.658 "memory_domains": [ 00:05:41.658 { 00:05:41.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:41.658 "dma_device_type": 2 00:05:41.658 } 00:05:41.658 ], 00:05:41.658 "driver_specific": {} 00:05:41.658 } 00:05:41.658 ]' 00:05:41.658 05:30:52 -- rpc/rpc.sh@17 -- # jq length 00:05:41.658 05:30:52 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:41.658 05:30:52 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:41.658 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.658 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.658 [2024-11-29 05:30:52.947853] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:41.658 [2024-11-29 05:30:52.947887] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:41.658 [2024-11-29 05:30:52.947904] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x498b4c0 00:05:41.658 [2024-11-29 05:30:52.947914] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:41.658 [2024-11-29 05:30:52.948596] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:41.658 [2024-11-29 05:30:52.948626] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:41.658 Passthru0 00:05:41.658 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.658 05:30:52 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:41.658 05:30:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.658 05:30:52 -- common/autotest_common.sh@10 -- # set +x 00:05:41.918 05:30:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.918 05:30:52 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:41.918 { 00:05:41.918 "name": "Malloc2", 00:05:41.918 "aliases": [ 00:05:41.918 "60c10f44-c31c-47a0-ae9e-48555c4746d9" 00:05:41.918 ], 00:05:41.918 "product_name": "Malloc disk", 00:05:41.918 "block_size": 512, 00:05:41.918 "num_blocks": 16384, 
00:05:41.918 "uuid": "60c10f44-c31c-47a0-ae9e-48555c4746d9", 00:05:41.918 "assigned_rate_limits": { 00:05:41.918 "rw_ios_per_sec": 0, 00:05:41.918 "rw_mbytes_per_sec": 0, 00:05:41.918 "r_mbytes_per_sec": 0, 00:05:41.918 "w_mbytes_per_sec": 0 00:05:41.918 }, 00:05:41.918 "claimed": true, 00:05:41.918 "claim_type": "exclusive_write", 00:05:41.918 "zoned": false, 00:05:41.918 "supported_io_types": { 00:05:41.918 "read": true, 00:05:41.918 "write": true, 00:05:41.918 "unmap": true, 00:05:41.918 "write_zeroes": true, 00:05:41.918 "flush": true, 00:05:41.918 "reset": true, 00:05:41.918 "compare": false, 00:05:41.918 "compare_and_write": false, 00:05:41.918 "abort": true, 00:05:41.918 "nvme_admin": false, 00:05:41.918 "nvme_io": false 00:05:41.918 }, 00:05:41.918 "memory_domains": [ 00:05:41.918 { 00:05:41.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:41.918 "dma_device_type": 2 00:05:41.918 } 00:05:41.918 ], 00:05:41.918 "driver_specific": {} 00:05:41.918 }, 00:05:41.918 { 00:05:41.918 "name": "Passthru0", 00:05:41.918 "aliases": [ 00:05:41.918 "5b784be8-f4be-5b7d-855e-368dbce2a0a3" 00:05:41.918 ], 00:05:41.918 "product_name": "passthru", 00:05:41.918 "block_size": 512, 00:05:41.918 "num_blocks": 16384, 00:05:41.918 "uuid": "5b784be8-f4be-5b7d-855e-368dbce2a0a3", 00:05:41.918 "assigned_rate_limits": { 00:05:41.918 "rw_ios_per_sec": 0, 00:05:41.918 "rw_mbytes_per_sec": 0, 00:05:41.918 "r_mbytes_per_sec": 0, 00:05:41.918 "w_mbytes_per_sec": 0 00:05:41.918 }, 00:05:41.918 "claimed": false, 00:05:41.918 "zoned": false, 00:05:41.918 "supported_io_types": { 00:05:41.918 "read": true, 00:05:41.918 "write": true, 00:05:41.918 "unmap": true, 00:05:41.918 "write_zeroes": true, 00:05:41.918 "flush": true, 00:05:41.918 "reset": true, 00:05:41.918 "compare": false, 00:05:41.918 "compare_and_write": false, 00:05:41.918 "abort": true, 00:05:41.918 "nvme_admin": false, 00:05:41.918 "nvme_io": false 00:05:41.918 }, 00:05:41.918 "memory_domains": [ 00:05:41.918 { 00:05:41.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:41.918 "dma_device_type": 2 00:05:41.918 } 00:05:41.918 ], 00:05:41.918 "driver_specific": { 00:05:41.918 "passthru": { 00:05:41.918 "name": "Passthru0", 00:05:41.918 "base_bdev_name": "Malloc2" 00:05:41.918 } 00:05:41.918 } 00:05:41.918 } 00:05:41.918 ]' 00:05:41.918 05:30:52 -- rpc/rpc.sh@21 -- # jq length 00:05:41.918 05:30:53 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:41.918 05:30:53 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:41.918 05:30:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.918 05:30:53 -- common/autotest_common.sh@10 -- # set +x 00:05:41.918 05:30:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.918 05:30:53 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:41.918 05:30:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.918 05:30:53 -- common/autotest_common.sh@10 -- # set +x 00:05:41.918 05:30:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.918 05:30:53 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:41.918 05:30:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.918 05:30:53 -- common/autotest_common.sh@10 -- # set +x 00:05:41.918 05:30:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.918 05:30:53 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:41.918 05:30:53 -- rpc/rpc.sh@26 -- # jq length 00:05:41.918 05:30:53 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:41.918 00:05:41.918 real 0m0.245s 00:05:41.918 user 0m0.138s 00:05:41.918 sys 0m0.049s 00:05:41.918 
05:30:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:41.918 05:30:53 -- common/autotest_common.sh@10 -- # set +x 00:05:41.918 ************************************ 00:05:41.918 END TEST rpc_daemon_integrity 00:05:41.918 ************************************ 00:05:41.918 05:30:53 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:41.918 05:30:53 -- rpc/rpc.sh@84 -- # killprocess 2190427 00:05:41.918 05:30:53 -- common/autotest_common.sh@936 -- # '[' -z 2190427 ']' 00:05:41.918 05:30:53 -- common/autotest_common.sh@940 -- # kill -0 2190427 00:05:41.918 05:30:53 -- common/autotest_common.sh@941 -- # uname 00:05:41.918 05:30:53 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:41.918 05:30:53 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2190427 00:05:41.918 05:30:53 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:41.918 05:30:53 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:41.918 05:30:53 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2190427' 00:05:41.918 killing process with pid 2190427 00:05:41.918 05:30:53 -- common/autotest_common.sh@955 -- # kill 2190427 00:05:41.918 05:30:53 -- common/autotest_common.sh@960 -- # wait 2190427 00:05:42.177 00:05:42.177 real 0m2.396s 00:05:42.177 user 0m2.914s 00:05:42.177 sys 0m0.761s 00:05:42.177 05:30:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:42.177 05:30:53 -- common/autotest_common.sh@10 -- # set +x 00:05:42.177 ************************************ 00:05:42.177 END TEST rpc 00:05:42.177 ************************************ 00:05:42.436 05:30:53 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:42.436 05:30:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.436 05:30:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.436 05:30:53 -- common/autotest_common.sh@10 -- # set +x 00:05:42.436 ************************************ 00:05:42.436 START TEST rpc_client 00:05:42.436 ************************************ 00:05:42.436 05:30:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:42.436 * Looking for test storage... 
00:05:42.436 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:42.436 05:30:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:42.436 05:30:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:42.436 05:30:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:42.436 05:30:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:42.436 05:30:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:42.436 05:30:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:42.436 05:30:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:42.436 05:30:53 -- scripts/common.sh@335 -- # IFS=.-: 00:05:42.436 05:30:53 -- scripts/common.sh@335 -- # read -ra ver1 00:05:42.436 05:30:53 -- scripts/common.sh@336 -- # IFS=.-: 00:05:42.436 05:30:53 -- scripts/common.sh@336 -- # read -ra ver2 00:05:42.436 05:30:53 -- scripts/common.sh@337 -- # local 'op=<' 00:05:42.436 05:30:53 -- scripts/common.sh@339 -- # ver1_l=2 00:05:42.436 05:30:53 -- scripts/common.sh@340 -- # ver2_l=1 00:05:42.436 05:30:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:42.436 05:30:53 -- scripts/common.sh@343 -- # case "$op" in 00:05:42.436 05:30:53 -- scripts/common.sh@344 -- # : 1 00:05:42.436 05:30:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:42.436 05:30:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:42.436 05:30:53 -- scripts/common.sh@364 -- # decimal 1 00:05:42.436 05:30:53 -- scripts/common.sh@352 -- # local d=1 00:05:42.436 05:30:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:42.436 05:30:53 -- scripts/common.sh@354 -- # echo 1 00:05:42.436 05:30:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:42.436 05:30:53 -- scripts/common.sh@365 -- # decimal 2 00:05:42.436 05:30:53 -- scripts/common.sh@352 -- # local d=2 00:05:42.436 05:30:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:42.436 05:30:53 -- scripts/common.sh@354 -- # echo 2 00:05:42.436 05:30:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:42.436 05:30:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:42.436 05:30:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:42.436 05:30:53 -- scripts/common.sh@367 -- # return 0 00:05:42.436 05:30:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:42.436 05:30:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:42.436 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.436 --rc genhtml_branch_coverage=1 00:05:42.436 --rc genhtml_function_coverage=1 00:05:42.436 --rc genhtml_legend=1 00:05:42.436 --rc geninfo_all_blocks=1 00:05:42.436 --rc geninfo_unexecuted_blocks=1 00:05:42.436 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.436 ' 00:05:42.436 05:30:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:42.436 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.436 --rc genhtml_branch_coverage=1 00:05:42.436 --rc genhtml_function_coverage=1 00:05:42.436 --rc genhtml_legend=1 00:05:42.436 --rc geninfo_all_blocks=1 00:05:42.436 --rc geninfo_unexecuted_blocks=1 00:05:42.436 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.436 ' 00:05:42.436 05:30:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:42.436 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.436 --rc genhtml_branch_coverage=1 
00:05:42.436 --rc genhtml_function_coverage=1 00:05:42.436 --rc genhtml_legend=1 00:05:42.436 --rc geninfo_all_blocks=1 00:05:42.436 --rc geninfo_unexecuted_blocks=1 00:05:42.436 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.436 ' 00:05:42.436 05:30:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:42.436 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.436 --rc genhtml_branch_coverage=1 00:05:42.437 --rc genhtml_function_coverage=1 00:05:42.437 --rc genhtml_legend=1 00:05:42.437 --rc geninfo_all_blocks=1 00:05:42.437 --rc geninfo_unexecuted_blocks=1 00:05:42.437 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.437 ' 00:05:42.437 05:30:53 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:42.437 OK 00:05:42.437 05:30:53 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:42.437 00:05:42.437 real 0m0.203s 00:05:42.437 user 0m0.114s 00:05:42.437 sys 0m0.105s 00:05:42.437 05:30:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:42.437 05:30:53 -- common/autotest_common.sh@10 -- # set +x 00:05:42.437 ************************************ 00:05:42.437 END TEST rpc_client 00:05:42.437 ************************************ 00:05:42.697 05:30:53 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:42.697 05:30:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.697 05:30:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.697 05:30:53 -- common/autotest_common.sh@10 -- # set +x 00:05:42.697 ************************************ 00:05:42.697 START TEST json_config 00:05:42.697 ************************************ 00:05:42.697 05:30:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:42.697 05:30:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:42.697 05:30:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:42.697 05:30:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:42.697 05:30:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:42.697 05:30:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:42.697 05:30:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:42.697 05:30:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:42.697 05:30:53 -- scripts/common.sh@335 -- # IFS=.-: 00:05:42.697 05:30:53 -- scripts/common.sh@335 -- # read -ra ver1 00:05:42.697 05:30:53 -- scripts/common.sh@336 -- # IFS=.-: 00:05:42.697 05:30:53 -- scripts/common.sh@336 -- # read -ra ver2 00:05:42.697 05:30:53 -- scripts/common.sh@337 -- # local 'op=<' 00:05:42.697 05:30:53 -- scripts/common.sh@339 -- # ver1_l=2 00:05:42.697 05:30:53 -- scripts/common.sh@340 -- # ver2_l=1 00:05:42.697 05:30:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:42.697 05:30:53 -- scripts/common.sh@343 -- # case "$op" in 00:05:42.697 05:30:53 -- scripts/common.sh@344 -- # : 1 00:05:42.697 05:30:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:42.697 05:30:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:42.697 05:30:53 -- scripts/common.sh@364 -- # decimal 1 00:05:42.697 05:30:53 -- scripts/common.sh@352 -- # local d=1 00:05:42.697 05:30:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:42.697 05:30:53 -- scripts/common.sh@354 -- # echo 1 00:05:42.697 05:30:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:42.697 05:30:53 -- scripts/common.sh@365 -- # decimal 2 00:05:42.697 05:30:53 -- scripts/common.sh@352 -- # local d=2 00:05:42.697 05:30:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:42.697 05:30:53 -- scripts/common.sh@354 -- # echo 2 00:05:42.697 05:30:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:42.697 05:30:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:42.697 05:30:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:42.697 05:30:53 -- scripts/common.sh@367 -- # return 0 00:05:42.697 05:30:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:42.697 05:30:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:42.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.697 --rc genhtml_branch_coverage=1 00:05:42.697 --rc genhtml_function_coverage=1 00:05:42.697 --rc genhtml_legend=1 00:05:42.697 --rc geninfo_all_blocks=1 00:05:42.697 --rc geninfo_unexecuted_blocks=1 00:05:42.697 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.697 ' 00:05:42.697 05:30:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:42.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.697 --rc genhtml_branch_coverage=1 00:05:42.697 --rc genhtml_function_coverage=1 00:05:42.697 --rc genhtml_legend=1 00:05:42.697 --rc geninfo_all_blocks=1 00:05:42.697 --rc geninfo_unexecuted_blocks=1 00:05:42.697 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.697 ' 00:05:42.697 05:30:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:42.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.697 --rc genhtml_branch_coverage=1 00:05:42.697 --rc genhtml_function_coverage=1 00:05:42.697 --rc genhtml_legend=1 00:05:42.697 --rc geninfo_all_blocks=1 00:05:42.697 --rc geninfo_unexecuted_blocks=1 00:05:42.697 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.697 ' 00:05:42.697 05:30:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:42.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.697 --rc genhtml_branch_coverage=1 00:05:42.697 --rc genhtml_function_coverage=1 00:05:42.697 --rc genhtml_legend=1 00:05:42.697 --rc geninfo_all_blocks=1 00:05:42.697 --rc geninfo_unexecuted_blocks=1 00:05:42.697 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.697 ' 00:05:42.697 05:30:53 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:42.697 05:30:53 -- nvmf/common.sh@7 -- # uname -s 00:05:42.697 05:30:53 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:42.697 05:30:53 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:42.697 05:30:53 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:42.697 05:30:53 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:42.698 05:30:53 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:42.698 05:30:53 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:42.698 05:30:53 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:42.698 05:30:53 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:42.698 05:30:53 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:42.698 05:30:53 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:42.698 05:30:53 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:42.698 05:30:53 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:42.698 05:30:53 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:42.698 05:30:53 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:42.698 05:30:53 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:42.698 05:30:53 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:42.698 05:30:53 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:42.698 05:30:53 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:42.698 05:30:53 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:42.698 05:30:53 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:42.698 05:30:53 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:42.698 05:30:53 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:42.698 05:30:53 -- paths/export.sh@5 -- # export PATH 00:05:42.698 05:30:53 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:42.698 05:30:53 -- nvmf/common.sh@46 -- # : 0 00:05:42.698 05:30:53 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:42.698 05:30:53 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:42.698 05:30:53 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:42.698 05:30:53 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:42.698 05:30:53 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:42.698 05:30:53 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:42.698 05:30:53 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:42.698 
05:30:53 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:42.698 05:30:53 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:42.698 05:30:53 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:42.698 05:30:53 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:42.698 05:30:53 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:42.698 05:30:53 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:42.698 WARNING: No tests are enabled so not running JSON configuration tests 00:05:42.698 05:30:53 -- json_config/json_config.sh@27 -- # exit 0 00:05:42.698 00:05:42.698 real 0m0.190s 00:05:42.698 user 0m0.100s 00:05:42.698 sys 0m0.099s 00:05:42.698 05:30:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:42.698 05:30:53 -- common/autotest_common.sh@10 -- # set +x 00:05:42.698 ************************************ 00:05:42.698 END TEST json_config 00:05:42.698 ************************************ 00:05:42.958 05:30:54 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:42.958 05:30:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.958 05:30:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.958 05:30:54 -- common/autotest_common.sh@10 -- # set +x 00:05:42.958 ************************************ 00:05:42.958 START TEST json_config_extra_key 00:05:42.959 ************************************ 00:05:42.959 05:30:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:42.959 05:30:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:42.959 05:30:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:42.959 05:30:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:42.959 05:30:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:42.959 05:30:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:42.959 05:30:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:42.959 05:30:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:42.959 05:30:54 -- scripts/common.sh@335 -- # IFS=.-: 00:05:42.959 05:30:54 -- scripts/common.sh@335 -- # read -ra ver1 00:05:42.959 05:30:54 -- scripts/common.sh@336 -- # IFS=.-: 00:05:42.959 05:30:54 -- scripts/common.sh@336 -- # read -ra ver2 00:05:42.959 05:30:54 -- scripts/common.sh@337 -- # local 'op=<' 00:05:42.959 05:30:54 -- scripts/common.sh@339 -- # ver1_l=2 00:05:42.959 05:30:54 -- scripts/common.sh@340 -- # ver2_l=1 00:05:42.959 05:30:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:42.959 05:30:54 -- scripts/common.sh@343 -- # case "$op" in 00:05:42.959 05:30:54 -- scripts/common.sh@344 -- # : 1 00:05:42.959 05:30:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:42.959 05:30:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:42.959 05:30:54 -- scripts/common.sh@364 -- # decimal 1 00:05:42.959 05:30:54 -- scripts/common.sh@352 -- # local d=1 00:05:42.959 05:30:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:42.959 05:30:54 -- scripts/common.sh@354 -- # echo 1 00:05:42.959 05:30:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:42.959 05:30:54 -- scripts/common.sh@365 -- # decimal 2 00:05:42.959 05:30:54 -- scripts/common.sh@352 -- # local d=2 00:05:42.959 05:30:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:42.959 05:30:54 -- scripts/common.sh@354 -- # echo 2 00:05:42.959 05:30:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:42.959 05:30:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:42.959 05:30:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:42.959 05:30:54 -- scripts/common.sh@367 -- # return 0 00:05:42.959 05:30:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:42.959 05:30:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:42.959 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.959 --rc genhtml_branch_coverage=1 00:05:42.959 --rc genhtml_function_coverage=1 00:05:42.959 --rc genhtml_legend=1 00:05:42.959 --rc geninfo_all_blocks=1 00:05:42.959 --rc geninfo_unexecuted_blocks=1 00:05:42.959 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.959 ' 00:05:42.959 05:30:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:42.959 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.959 --rc genhtml_branch_coverage=1 00:05:42.959 --rc genhtml_function_coverage=1 00:05:42.959 --rc genhtml_legend=1 00:05:42.959 --rc geninfo_all_blocks=1 00:05:42.959 --rc geninfo_unexecuted_blocks=1 00:05:42.959 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.959 ' 00:05:42.959 05:30:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:42.959 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.959 --rc genhtml_branch_coverage=1 00:05:42.959 --rc genhtml_function_coverage=1 00:05:42.959 --rc genhtml_legend=1 00:05:42.959 --rc geninfo_all_blocks=1 00:05:42.959 --rc geninfo_unexecuted_blocks=1 00:05:42.959 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.959 ' 00:05:42.959 05:30:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:42.959 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.959 --rc genhtml_branch_coverage=1 00:05:42.959 --rc genhtml_function_coverage=1 00:05:42.959 --rc genhtml_legend=1 00:05:42.959 --rc geninfo_all_blocks=1 00:05:42.959 --rc geninfo_unexecuted_blocks=1 00:05:42.959 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.959 ' 00:05:42.959 05:30:54 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:42.959 05:30:54 -- nvmf/common.sh@7 -- # uname -s 00:05:42.959 05:30:54 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:42.959 05:30:54 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:42.959 05:30:54 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:42.959 05:30:54 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:42.959 05:30:54 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:42.959 05:30:54 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:42.959 05:30:54 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:42.959 05:30:54 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:42.959 05:30:54 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:42.959 05:30:54 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:42.959 05:30:54 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:42.959 05:30:54 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:42.959 05:30:54 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:42.959 05:30:54 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:42.959 05:30:54 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:42.959 05:30:54 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:42.959 05:30:54 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:42.959 05:30:54 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:42.959 05:30:54 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:42.959 05:30:54 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:42.959 05:30:54 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:42.959 05:30:54 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:42.959 05:30:54 -- paths/export.sh@5 -- # export PATH 00:05:42.959 05:30:54 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:42.959 05:30:54 -- nvmf/common.sh@46 -- # : 0 00:05:42.959 05:30:54 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:42.959 05:30:54 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:42.959 05:30:54 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:42.959 05:30:54 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:42.959 05:30:54 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:42.959 05:30:54 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:42.959 05:30:54 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:42.959 
05:30:54 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:42.959 05:30:54 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:42.959 05:30:54 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:42.960 INFO: launching applications... 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=2191225 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:42.960 Waiting for target to run... 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 2191225 /var/tmp/spdk_tgt.sock 00:05:42.960 05:30:54 -- common/autotest_common.sh@829 -- # '[' -z 2191225 ']' 00:05:42.960 05:30:54 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:42.960 05:30:54 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:42.960 05:30:54 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:42.960 05:30:54 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:42.960 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:42.960 05:30:54 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:42.960 05:30:54 -- common/autotest_common.sh@10 -- # set +x 00:05:42.960 [2024-11-29 05:30:54.232091] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
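json_config_test_start_app launches spdk_tgt with -r /var/tmp/spdk_tgt.sock and then blocks in waitforlisten until the RPC socket is usable. A minimal sketch of that wait pattern (an assumption: the real helper in common/autotest_common.sh probes the socket with retried RPC calls rather than only checking that the file exists):

wait_for_unix_socket() {
    local pid=$1 sock=$2 i
    for (( i = 0; i < 100; i++ )); do
        kill -0 "$pid" 2>/dev/null || return 1   # the app died while starting
        [[ -S $sock ]] && return 0               # socket present: app is listening
        sleep 0.1
    done
    return 1
}
./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json extra_key.json &
wait_for_unix_socket $! /var/tmp/spdk_tgt.sock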
00:05:42.960 [2024-11-29 05:30:54.232179] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2191225 ] 00:05:43.219 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.477 [2024-11-29 05:30:54.658952] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.477 [2024-11-29 05:30:54.685756] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:43.477 [2024-11-29 05:30:54.685872] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.043 05:30:55 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:44.043 05:30:55 -- common/autotest_common.sh@862 -- # return 0 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:44.043 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:44.043 INFO: shutting down applications... 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 2191225 ]] 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 2191225 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2191225 00:05:44.043 05:30:55 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:44.301 05:30:55 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:44.301 05:30:55 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:44.301 05:30:55 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2191225 00:05:44.301 05:30:55 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:44.301 05:30:55 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:44.301 05:30:55 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:44.301 05:30:55 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:44.301 SPDK target shutdown done 00:05:44.301 05:30:55 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:44.301 Success 00:05:44.301 00:05:44.301 real 0m1.559s 00:05:44.301 user 0m1.142s 00:05:44.301 sys 0m0.570s 00:05:44.301 05:30:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:44.301 05:30:55 -- common/autotest_common.sh@10 -- # set +x 00:05:44.301 ************************************ 00:05:44.301 END TEST json_config_extra_key 00:05:44.301 ************************************ 00:05:44.560 05:30:55 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:44.560 05:30:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:44.560 05:30:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:44.560 05:30:55 -- common/autotest_common.sh@10 -- # set +x 00:05:44.560 ************************************ 00:05:44.560 START TEST alias_rpc 00:05:44.560 ************************************ 00:05:44.560 05:30:55 -- 
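json_config_test_shutdown_app, traced just above, is a SIGINT-then-poll loop: kill -0 only tests that the pid still exists, and the test budgets 30 iterations of 0.5 s before giving up. As a stand-alone sketch (the escalation branch is an assumption; in the traced run the target is already gone on the first re-check):

shutdown_app() {
    local pid=$1 i
    kill -SIGINT "$pid"                          # ask the target to shut down cleanly
    for (( i = 0; i < 30; i++ )); do
        kill -0 "$pid" 2>/dev/null || return 0   # process gone: clean shutdown
        sleep 0.5
    done
    echo "app $pid did not exit, escalating" >&2
    kill -9 "$pid"
}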
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:44.560 * Looking for test storage... 00:05:44.560 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:44.560 05:30:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:44.560 05:30:55 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:44.560 05:30:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:44.560 05:30:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:44.560 05:30:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:44.560 05:30:55 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:44.560 05:30:55 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:44.560 05:30:55 -- scripts/common.sh@335 -- # IFS=.-: 00:05:44.560 05:30:55 -- scripts/common.sh@335 -- # read -ra ver1 00:05:44.560 05:30:55 -- scripts/common.sh@336 -- # IFS=.-: 00:05:44.560 05:30:55 -- scripts/common.sh@336 -- # read -ra ver2 00:05:44.560 05:30:55 -- scripts/common.sh@337 -- # local 'op=<' 00:05:44.560 05:30:55 -- scripts/common.sh@339 -- # ver1_l=2 00:05:44.560 05:30:55 -- scripts/common.sh@340 -- # ver2_l=1 00:05:44.560 05:30:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:44.560 05:30:55 -- scripts/common.sh@343 -- # case "$op" in 00:05:44.560 05:30:55 -- scripts/common.sh@344 -- # : 1 00:05:44.560 05:30:55 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:44.560 05:30:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:44.560 05:30:55 -- scripts/common.sh@364 -- # decimal 1 00:05:44.560 05:30:55 -- scripts/common.sh@352 -- # local d=1 00:05:44.560 05:30:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:44.560 05:30:55 -- scripts/common.sh@354 -- # echo 1 00:05:44.560 05:30:55 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:44.560 05:30:55 -- scripts/common.sh@365 -- # decimal 2 00:05:44.560 05:30:55 -- scripts/common.sh@352 -- # local d=2 00:05:44.560 05:30:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:44.560 05:30:55 -- scripts/common.sh@354 -- # echo 2 00:05:44.560 05:30:55 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:44.560 05:30:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:44.560 05:30:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:44.560 05:30:55 -- scripts/common.sh@367 -- # return 0 00:05:44.560 05:30:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:44.560 05:30:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:44.560 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.560 --rc genhtml_branch_coverage=1 00:05:44.560 --rc genhtml_function_coverage=1 00:05:44.560 --rc genhtml_legend=1 00:05:44.560 --rc geninfo_all_blocks=1 00:05:44.560 --rc geninfo_unexecuted_blocks=1 00:05:44.560 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:44.560 ' 00:05:44.560 05:30:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:44.560 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.560 --rc genhtml_branch_coverage=1 00:05:44.560 --rc genhtml_function_coverage=1 00:05:44.560 --rc genhtml_legend=1 00:05:44.560 --rc geninfo_all_blocks=1 00:05:44.560 --rc geninfo_unexecuted_blocks=1 00:05:44.560 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:44.560 ' 00:05:44.560 
05:30:55 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:44.560 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.560 --rc genhtml_branch_coverage=1 00:05:44.560 --rc genhtml_function_coverage=1 00:05:44.560 --rc genhtml_legend=1 00:05:44.560 --rc geninfo_all_blocks=1 00:05:44.560 --rc geninfo_unexecuted_blocks=1 00:05:44.560 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:44.560 ' 00:05:44.560 05:30:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:44.561 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.561 --rc genhtml_branch_coverage=1 00:05:44.561 --rc genhtml_function_coverage=1 00:05:44.561 --rc genhtml_legend=1 00:05:44.561 --rc geninfo_all_blocks=1 00:05:44.561 --rc geninfo_unexecuted_blocks=1 00:05:44.561 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:44.561 ' 00:05:44.561 05:30:55 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:44.561 05:30:55 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2191558 00:05:44.561 05:30:55 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2191558 00:05:44.561 05:30:55 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:44.561 05:30:55 -- common/autotest_common.sh@829 -- # '[' -z 2191558 ']' 00:05:44.561 05:30:55 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:44.561 05:30:55 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:44.561 05:30:55 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:44.561 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:44.561 05:30:55 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:44.561 05:30:55 -- common/autotest_common.sh@10 -- # set +x 00:05:44.561 [2024-11-29 05:30:55.836979] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:05:44.561 [2024-11-29 05:30:55.837067] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2191558 ] 00:05:44.819 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.819 [2024-11-29 05:30:55.905558] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:44.819 [2024-11-29 05:30:55.942987] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:44.819 [2024-11-29 05:30:55.943107] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:45.386 05:30:56 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:45.386 05:30:56 -- common/autotest_common.sh@862 -- # return 0 00:05:45.386 05:30:56 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:45.646 05:30:56 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2191558 00:05:45.646 05:30:56 -- common/autotest_common.sh@936 -- # '[' -z 2191558 ']' 00:05:45.646 05:30:56 -- common/autotest_common.sh@940 -- # kill -0 2191558 00:05:45.647 05:30:56 -- common/autotest_common.sh@941 -- # uname 00:05:45.647 05:30:56 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:45.647 05:30:56 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2191558 00:05:45.647 05:30:56 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:45.647 05:30:56 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:45.647 05:30:56 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2191558' 00:05:45.647 killing process with pid 2191558 00:05:45.647 05:30:56 -- common/autotest_common.sh@955 -- # kill 2191558 00:05:45.647 05:30:56 -- common/autotest_common.sh@960 -- # wait 2191558 00:05:46.214 00:05:46.214 real 0m1.599s 00:05:46.214 user 0m1.710s 00:05:46.214 sys 0m0.474s 00:05:46.214 05:30:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:46.214 05:30:57 -- common/autotest_common.sh@10 -- # set +x 00:05:46.214 ************************************ 00:05:46.214 END TEST alias_rpc 00:05:46.214 ************************************ 00:05:46.214 05:30:57 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:05:46.214 05:30:57 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:46.214 05:30:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:46.214 05:30:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:46.214 05:30:57 -- common/autotest_common.sh@10 -- # set +x 00:05:46.214 ************************************ 00:05:46.214 START TEST spdkcli_tcp 00:05:46.214 ************************************ 00:05:46.214 05:30:57 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:46.214 * Looking for test storage... 
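killprocess, used here to reap spdk_tgt after rpc.py load_config -i replays a saved configuration (with -i including deprecated RPC aliases, which is what this test exercises), checks the process name before signalling so that it never kills a bare sudo wrapper. A sketch reconstructed from the trace; how the real helper treats the sudo case is not shown here, so this version simply refuses:

killprocess() {
    local pid=$1
    [[ -n $pid ]] || return 1
    kill -0 "$pid" || return 1                       # must still be running
    if [[ $(uname) == Linux ]]; then
        local process_name
        process_name=$(ps --no-headers -o comm= "$pid")
        [[ $process_name != sudo ]] || return 1      # never signal the sudo parent
    fi
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" || true                              # reap it and tolerate its exit code
}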
00:05:46.214 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:46.214 05:30:57 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:46.214 05:30:57 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:46.214 05:30:57 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:46.214 05:30:57 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:46.214 05:30:57 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:46.214 05:30:57 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:46.214 05:30:57 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:46.214 05:30:57 -- scripts/common.sh@335 -- # IFS=.-: 00:05:46.214 05:30:57 -- scripts/common.sh@335 -- # read -ra ver1 00:05:46.214 05:30:57 -- scripts/common.sh@336 -- # IFS=.-: 00:05:46.214 05:30:57 -- scripts/common.sh@336 -- # read -ra ver2 00:05:46.214 05:30:57 -- scripts/common.sh@337 -- # local 'op=<' 00:05:46.214 05:30:57 -- scripts/common.sh@339 -- # ver1_l=2 00:05:46.214 05:30:57 -- scripts/common.sh@340 -- # ver2_l=1 00:05:46.214 05:30:57 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:46.214 05:30:57 -- scripts/common.sh@343 -- # case "$op" in 00:05:46.214 05:30:57 -- scripts/common.sh@344 -- # : 1 00:05:46.214 05:30:57 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:46.214 05:30:57 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:46.214 05:30:57 -- scripts/common.sh@364 -- # decimal 1 00:05:46.214 05:30:57 -- scripts/common.sh@352 -- # local d=1 00:05:46.214 05:30:57 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:46.214 05:30:57 -- scripts/common.sh@354 -- # echo 1 00:05:46.214 05:30:57 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:46.214 05:30:57 -- scripts/common.sh@365 -- # decimal 2 00:05:46.214 05:30:57 -- scripts/common.sh@352 -- # local d=2 00:05:46.214 05:30:57 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:46.214 05:30:57 -- scripts/common.sh@354 -- # echo 2 00:05:46.214 05:30:57 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:46.215 05:30:57 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:46.215 05:30:57 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:46.215 05:30:57 -- scripts/common.sh@367 -- # return 0 00:05:46.215 05:30:57 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:46.215 05:30:57 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:46.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.215 --rc genhtml_branch_coverage=1 00:05:46.215 --rc genhtml_function_coverage=1 00:05:46.215 --rc genhtml_legend=1 00:05:46.215 --rc geninfo_all_blocks=1 00:05:46.215 --rc geninfo_unexecuted_blocks=1 00:05:46.215 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.215 ' 00:05:46.215 05:30:57 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:46.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.215 --rc genhtml_branch_coverage=1 00:05:46.215 --rc genhtml_function_coverage=1 00:05:46.215 --rc genhtml_legend=1 00:05:46.215 --rc geninfo_all_blocks=1 00:05:46.215 --rc geninfo_unexecuted_blocks=1 00:05:46.215 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.215 ' 00:05:46.215 05:30:57 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:46.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.215 --rc genhtml_branch_coverage=1 
00:05:46.215 --rc genhtml_function_coverage=1 00:05:46.215 --rc genhtml_legend=1 00:05:46.215 --rc geninfo_all_blocks=1 00:05:46.215 --rc geninfo_unexecuted_blocks=1 00:05:46.215 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.215 ' 00:05:46.215 05:30:57 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:46.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.215 --rc genhtml_branch_coverage=1 00:05:46.215 --rc genhtml_function_coverage=1 00:05:46.215 --rc genhtml_legend=1 00:05:46.215 --rc geninfo_all_blocks=1 00:05:46.215 --rc geninfo_unexecuted_blocks=1 00:05:46.215 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.215 ' 00:05:46.215 05:30:57 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:46.215 05:30:57 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:46.215 05:30:57 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:46.215 05:30:57 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:46.215 05:30:57 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:46.215 05:30:57 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:46.215 05:30:57 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:46.215 05:30:57 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:46.215 05:30:57 -- common/autotest_common.sh@10 -- # set +x 00:05:46.215 05:30:57 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2191885 00:05:46.215 05:30:57 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:46.215 05:30:57 -- spdkcli/tcp.sh@27 -- # waitforlisten 2191885 00:05:46.215 05:30:57 -- common/autotest_common.sh@829 -- # '[' -z 2191885 ']' 00:05:46.215 05:30:57 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:46.215 05:30:57 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:46.215 05:30:57 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:46.215 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:46.215 05:30:57 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:46.215 05:30:57 -- common/autotest_common.sh@10 -- # set +x 00:05:46.215 [2024-11-29 05:30:57.484817] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
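spdk_tgt is started here with -m 0x3 -p 0: each set bit in the mask enables one core (cores 0 and 1, matching the two reactors that come up below), and -p picks core 0 as the main core. A quick illustrative expansion of a core mask; this helper is hypothetical and not part of the test:

mask=0x3
for (( i = 0; i < 32; i++ )); do
    (( (mask >> i) & 1 )) && echo "core $i enabled"
done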
00:05:46.215 [2024-11-29 05:30:57.484908] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2191885 ] 00:05:46.475 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.475 [2024-11-29 05:30:57.550127] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:46.475 [2024-11-29 05:30:57.586738] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:46.475 [2024-11-29 05:30:57.586908] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:46.475 [2024-11-29 05:30:57.586909] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.042 05:30:58 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:47.042 05:30:58 -- common/autotest_common.sh@862 -- # return 0 00:05:47.042 05:30:58 -- spdkcli/tcp.sh@31 -- # socat_pid=2192126 00:05:47.042 05:30:58 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:47.042 05:30:58 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:47.301 [ 00:05:47.301 "spdk_get_version", 00:05:47.301 "rpc_get_methods", 00:05:47.301 "trace_get_info", 00:05:47.301 "trace_get_tpoint_group_mask", 00:05:47.301 "trace_disable_tpoint_group", 00:05:47.301 "trace_enable_tpoint_group", 00:05:47.301 "trace_clear_tpoint_mask", 00:05:47.301 "trace_set_tpoint_mask", 00:05:47.301 "vfu_tgt_set_base_path", 00:05:47.301 "framework_get_pci_devices", 00:05:47.301 "framework_get_config", 00:05:47.301 "framework_get_subsystems", 00:05:47.301 "iobuf_get_stats", 00:05:47.301 "iobuf_set_options", 00:05:47.301 "sock_set_default_impl", 00:05:47.301 "sock_impl_set_options", 00:05:47.301 "sock_impl_get_options", 00:05:47.301 "vmd_rescan", 00:05:47.301 "vmd_remove_device", 00:05:47.301 "vmd_enable", 00:05:47.301 "accel_get_stats", 00:05:47.301 "accel_set_options", 00:05:47.301 "accel_set_driver", 00:05:47.301 "accel_crypto_key_destroy", 00:05:47.301 "accel_crypto_keys_get", 00:05:47.301 "accel_crypto_key_create", 00:05:47.301 "accel_assign_opc", 00:05:47.301 "accel_get_module_info", 00:05:47.301 "accel_get_opc_assignments", 00:05:47.301 "notify_get_notifications", 00:05:47.301 "notify_get_types", 00:05:47.301 "bdev_get_histogram", 00:05:47.301 "bdev_enable_histogram", 00:05:47.301 "bdev_set_qos_limit", 00:05:47.301 "bdev_set_qd_sampling_period", 00:05:47.301 "bdev_get_bdevs", 00:05:47.301 "bdev_reset_iostat", 00:05:47.301 "bdev_get_iostat", 00:05:47.301 "bdev_examine", 00:05:47.301 "bdev_wait_for_examine", 00:05:47.301 "bdev_set_options", 00:05:47.301 "scsi_get_devices", 00:05:47.301 "thread_set_cpumask", 00:05:47.301 "framework_get_scheduler", 00:05:47.301 "framework_set_scheduler", 00:05:47.301 "framework_get_reactors", 00:05:47.301 "thread_get_io_channels", 00:05:47.301 "thread_get_pollers", 00:05:47.301 "thread_get_stats", 00:05:47.301 "framework_monitor_context_switch", 00:05:47.301 "spdk_kill_instance", 00:05:47.301 "log_enable_timestamps", 00:05:47.301 "log_get_flags", 00:05:47.301 "log_clear_flag", 00:05:47.301 "log_set_flag", 00:05:47.301 "log_get_level", 00:05:47.301 "log_set_level", 00:05:47.301 "log_get_print_level", 00:05:47.301 "log_set_print_level", 00:05:47.301 "framework_enable_cpumask_locks", 00:05:47.301 "framework_disable_cpumask_locks", 00:05:47.301 "framework_wait_init", 00:05:47.301 
"framework_start_init", 00:05:47.302 "virtio_blk_create_transport", 00:05:47.302 "virtio_blk_get_transports", 00:05:47.302 "vhost_controller_set_coalescing", 00:05:47.302 "vhost_get_controllers", 00:05:47.302 "vhost_delete_controller", 00:05:47.302 "vhost_create_blk_controller", 00:05:47.302 "vhost_scsi_controller_remove_target", 00:05:47.302 "vhost_scsi_controller_add_target", 00:05:47.302 "vhost_start_scsi_controller", 00:05:47.302 "vhost_create_scsi_controller", 00:05:47.302 "ublk_recover_disk", 00:05:47.302 "ublk_get_disks", 00:05:47.302 "ublk_stop_disk", 00:05:47.302 "ublk_start_disk", 00:05:47.302 "ublk_destroy_target", 00:05:47.302 "ublk_create_target", 00:05:47.302 "nbd_get_disks", 00:05:47.302 "nbd_stop_disk", 00:05:47.302 "nbd_start_disk", 00:05:47.302 "env_dpdk_get_mem_stats", 00:05:47.302 "nvmf_subsystem_get_listeners", 00:05:47.302 "nvmf_subsystem_get_qpairs", 00:05:47.302 "nvmf_subsystem_get_controllers", 00:05:47.302 "nvmf_get_stats", 00:05:47.302 "nvmf_get_transports", 00:05:47.302 "nvmf_create_transport", 00:05:47.302 "nvmf_get_targets", 00:05:47.302 "nvmf_delete_target", 00:05:47.302 "nvmf_create_target", 00:05:47.302 "nvmf_subsystem_allow_any_host", 00:05:47.302 "nvmf_subsystem_remove_host", 00:05:47.302 "nvmf_subsystem_add_host", 00:05:47.302 "nvmf_subsystem_remove_ns", 00:05:47.302 "nvmf_subsystem_add_ns", 00:05:47.302 "nvmf_subsystem_listener_set_ana_state", 00:05:47.302 "nvmf_discovery_get_referrals", 00:05:47.302 "nvmf_discovery_remove_referral", 00:05:47.302 "nvmf_discovery_add_referral", 00:05:47.302 "nvmf_subsystem_remove_listener", 00:05:47.302 "nvmf_subsystem_add_listener", 00:05:47.302 "nvmf_delete_subsystem", 00:05:47.302 "nvmf_create_subsystem", 00:05:47.302 "nvmf_get_subsystems", 00:05:47.302 "nvmf_set_crdt", 00:05:47.302 "nvmf_set_config", 00:05:47.302 "nvmf_set_max_subsystems", 00:05:47.302 "iscsi_set_options", 00:05:47.302 "iscsi_get_auth_groups", 00:05:47.302 "iscsi_auth_group_remove_secret", 00:05:47.302 "iscsi_auth_group_add_secret", 00:05:47.302 "iscsi_delete_auth_group", 00:05:47.302 "iscsi_create_auth_group", 00:05:47.302 "iscsi_set_discovery_auth", 00:05:47.302 "iscsi_get_options", 00:05:47.302 "iscsi_target_node_request_logout", 00:05:47.302 "iscsi_target_node_set_redirect", 00:05:47.302 "iscsi_target_node_set_auth", 00:05:47.302 "iscsi_target_node_add_lun", 00:05:47.302 "iscsi_get_connections", 00:05:47.302 "iscsi_portal_group_set_auth", 00:05:47.302 "iscsi_start_portal_group", 00:05:47.302 "iscsi_delete_portal_group", 00:05:47.302 "iscsi_create_portal_group", 00:05:47.302 "iscsi_get_portal_groups", 00:05:47.302 "iscsi_delete_target_node", 00:05:47.302 "iscsi_target_node_remove_pg_ig_maps", 00:05:47.302 "iscsi_target_node_add_pg_ig_maps", 00:05:47.302 "iscsi_create_target_node", 00:05:47.302 "iscsi_get_target_nodes", 00:05:47.302 "iscsi_delete_initiator_group", 00:05:47.302 "iscsi_initiator_group_remove_initiators", 00:05:47.302 "iscsi_initiator_group_add_initiators", 00:05:47.302 "iscsi_create_initiator_group", 00:05:47.302 "iscsi_get_initiator_groups", 00:05:47.302 "vfu_virtio_create_scsi_endpoint", 00:05:47.302 "vfu_virtio_scsi_remove_target", 00:05:47.302 "vfu_virtio_scsi_add_target", 00:05:47.302 "vfu_virtio_create_blk_endpoint", 00:05:47.302 "vfu_virtio_delete_endpoint", 00:05:47.302 "iaa_scan_accel_module", 00:05:47.302 "dsa_scan_accel_module", 00:05:47.302 "ioat_scan_accel_module", 00:05:47.302 "accel_error_inject_error", 00:05:47.302 "bdev_iscsi_delete", 00:05:47.302 "bdev_iscsi_create", 00:05:47.302 "bdev_iscsi_set_options", 
00:05:47.302 "bdev_virtio_attach_controller", 00:05:47.302 "bdev_virtio_scsi_get_devices", 00:05:47.302 "bdev_virtio_detach_controller", 00:05:47.302 "bdev_virtio_blk_set_hotplug", 00:05:47.302 "bdev_ftl_set_property", 00:05:47.302 "bdev_ftl_get_properties", 00:05:47.302 "bdev_ftl_get_stats", 00:05:47.302 "bdev_ftl_unmap", 00:05:47.302 "bdev_ftl_unload", 00:05:47.302 "bdev_ftl_delete", 00:05:47.302 "bdev_ftl_load", 00:05:47.302 "bdev_ftl_create", 00:05:47.302 "bdev_aio_delete", 00:05:47.302 "bdev_aio_rescan", 00:05:47.302 "bdev_aio_create", 00:05:47.302 "blobfs_create", 00:05:47.302 "blobfs_detect", 00:05:47.302 "blobfs_set_cache_size", 00:05:47.302 "bdev_zone_block_delete", 00:05:47.302 "bdev_zone_block_create", 00:05:47.302 "bdev_delay_delete", 00:05:47.302 "bdev_delay_create", 00:05:47.302 "bdev_delay_update_latency", 00:05:47.302 "bdev_split_delete", 00:05:47.302 "bdev_split_create", 00:05:47.302 "bdev_error_inject_error", 00:05:47.302 "bdev_error_delete", 00:05:47.302 "bdev_error_create", 00:05:47.302 "bdev_raid_set_options", 00:05:47.302 "bdev_raid_remove_base_bdev", 00:05:47.302 "bdev_raid_add_base_bdev", 00:05:47.302 "bdev_raid_delete", 00:05:47.302 "bdev_raid_create", 00:05:47.302 "bdev_raid_get_bdevs", 00:05:47.302 "bdev_lvol_grow_lvstore", 00:05:47.302 "bdev_lvol_get_lvols", 00:05:47.302 "bdev_lvol_get_lvstores", 00:05:47.302 "bdev_lvol_delete", 00:05:47.302 "bdev_lvol_set_read_only", 00:05:47.302 "bdev_lvol_resize", 00:05:47.302 "bdev_lvol_decouple_parent", 00:05:47.302 "bdev_lvol_inflate", 00:05:47.302 "bdev_lvol_rename", 00:05:47.302 "bdev_lvol_clone_bdev", 00:05:47.302 "bdev_lvol_clone", 00:05:47.302 "bdev_lvol_snapshot", 00:05:47.302 "bdev_lvol_create", 00:05:47.302 "bdev_lvol_delete_lvstore", 00:05:47.302 "bdev_lvol_rename_lvstore", 00:05:47.302 "bdev_lvol_create_lvstore", 00:05:47.302 "bdev_passthru_delete", 00:05:47.302 "bdev_passthru_create", 00:05:47.302 "bdev_nvme_cuse_unregister", 00:05:47.302 "bdev_nvme_cuse_register", 00:05:47.302 "bdev_opal_new_user", 00:05:47.302 "bdev_opal_set_lock_state", 00:05:47.302 "bdev_opal_delete", 00:05:47.302 "bdev_opal_get_info", 00:05:47.302 "bdev_opal_create", 00:05:47.302 "bdev_nvme_opal_revert", 00:05:47.302 "bdev_nvme_opal_init", 00:05:47.302 "bdev_nvme_send_cmd", 00:05:47.302 "bdev_nvme_get_path_iostat", 00:05:47.302 "bdev_nvme_get_mdns_discovery_info", 00:05:47.302 "bdev_nvme_stop_mdns_discovery", 00:05:47.302 "bdev_nvme_start_mdns_discovery", 00:05:47.302 "bdev_nvme_set_multipath_policy", 00:05:47.302 "bdev_nvme_set_preferred_path", 00:05:47.302 "bdev_nvme_get_io_paths", 00:05:47.302 "bdev_nvme_remove_error_injection", 00:05:47.302 "bdev_nvme_add_error_injection", 00:05:47.302 "bdev_nvme_get_discovery_info", 00:05:47.302 "bdev_nvme_stop_discovery", 00:05:47.302 "bdev_nvme_start_discovery", 00:05:47.302 "bdev_nvme_get_controller_health_info", 00:05:47.302 "bdev_nvme_disable_controller", 00:05:47.302 "bdev_nvme_enable_controller", 00:05:47.302 "bdev_nvme_reset_controller", 00:05:47.302 "bdev_nvme_get_transport_statistics", 00:05:47.302 "bdev_nvme_apply_firmware", 00:05:47.302 "bdev_nvme_detach_controller", 00:05:47.302 "bdev_nvme_get_controllers", 00:05:47.302 "bdev_nvme_attach_controller", 00:05:47.302 "bdev_nvme_set_hotplug", 00:05:47.302 "bdev_nvme_set_options", 00:05:47.302 "bdev_null_resize", 00:05:47.302 "bdev_null_delete", 00:05:47.302 "bdev_null_create", 00:05:47.302 "bdev_malloc_delete", 00:05:47.302 "bdev_malloc_create" 00:05:47.302 ] 00:05:47.302 05:30:58 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:47.302 05:30:58 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:47.302 05:30:58 -- common/autotest_common.sh@10 -- # set +x 00:05:47.302 05:30:58 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:47.302 05:30:58 -- spdkcli/tcp.sh@38 -- # killprocess 2191885 00:05:47.302 05:30:58 -- common/autotest_common.sh@936 -- # '[' -z 2191885 ']' 00:05:47.302 05:30:58 -- common/autotest_common.sh@940 -- # kill -0 2191885 00:05:47.302 05:30:58 -- common/autotest_common.sh@941 -- # uname 00:05:47.302 05:30:58 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:47.302 05:30:58 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2191885 00:05:47.302 05:30:58 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:47.302 05:30:58 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:47.302 05:30:58 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2191885' 00:05:47.302 killing process with pid 2191885 00:05:47.302 05:30:58 -- common/autotest_common.sh@955 -- # kill 2191885 00:05:47.302 05:30:58 -- common/autotest_common.sh@960 -- # wait 2191885 00:05:47.871 00:05:47.871 real 0m1.609s 00:05:47.871 user 0m2.976s 00:05:47.871 sys 0m0.487s 00:05:47.871 05:30:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:47.871 05:30:58 -- common/autotest_common.sh@10 -- # set +x 00:05:47.871 ************************************ 00:05:47.871 END TEST spdkcli_tcp 00:05:47.871 ************************************ 00:05:47.871 05:30:58 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:47.871 05:30:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:47.871 05:30:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:47.871 05:30:58 -- common/autotest_common.sh@10 -- # set +x 00:05:47.871 ************************************ 00:05:47.871 START TEST dpdk_mem_utility 00:05:47.871 ************************************ 00:05:47.871 05:30:58 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:47.871 * Looking for test storage... 
00:05:47.871 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:47.871 05:30:59 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:47.871 05:30:59 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:47.871 05:30:59 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:47.871 05:30:59 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:47.871 05:30:59 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:47.871 05:30:59 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:47.871 05:30:59 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:47.871 05:30:59 -- scripts/common.sh@335 -- # IFS=.-: 00:05:47.871 05:30:59 -- scripts/common.sh@335 -- # read -ra ver1 00:05:47.871 05:30:59 -- scripts/common.sh@336 -- # IFS=.-: 00:05:47.871 05:30:59 -- scripts/common.sh@336 -- # read -ra ver2 00:05:47.871 05:30:59 -- scripts/common.sh@337 -- # local 'op=<' 00:05:47.871 05:30:59 -- scripts/common.sh@339 -- # ver1_l=2 00:05:47.871 05:30:59 -- scripts/common.sh@340 -- # ver2_l=1 00:05:47.871 05:30:59 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:47.871 05:30:59 -- scripts/common.sh@343 -- # case "$op" in 00:05:47.871 05:30:59 -- scripts/common.sh@344 -- # : 1 00:05:47.871 05:30:59 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:47.871 05:30:59 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:47.871 05:30:59 -- scripts/common.sh@364 -- # decimal 1 00:05:47.871 05:30:59 -- scripts/common.sh@352 -- # local d=1 00:05:47.871 05:30:59 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:47.871 05:30:59 -- scripts/common.sh@354 -- # echo 1 00:05:47.871 05:30:59 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:47.871 05:30:59 -- scripts/common.sh@365 -- # decimal 2 00:05:47.871 05:30:59 -- scripts/common.sh@352 -- # local d=2 00:05:47.871 05:30:59 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:47.871 05:30:59 -- scripts/common.sh@354 -- # echo 2 00:05:47.871 05:30:59 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:47.871 05:30:59 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:47.871 05:30:59 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:47.871 05:30:59 -- scripts/common.sh@367 -- # return 0 00:05:47.871 05:30:59 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:47.871 05:30:59 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:47.871 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.871 --rc genhtml_branch_coverage=1 00:05:47.871 --rc genhtml_function_coverage=1 00:05:47.871 --rc genhtml_legend=1 00:05:47.871 --rc geninfo_all_blocks=1 00:05:47.871 --rc geninfo_unexecuted_blocks=1 00:05:47.871 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.871 ' 00:05:47.871 05:30:59 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:47.871 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.871 --rc genhtml_branch_coverage=1 00:05:47.871 --rc genhtml_function_coverage=1 00:05:47.871 --rc genhtml_legend=1 00:05:47.871 --rc geninfo_all_blocks=1 00:05:47.871 --rc geninfo_unexecuted_blocks=1 00:05:47.871 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.871 ' 00:05:47.871 05:30:59 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:47.871 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.871 --rc 
genhtml_branch_coverage=1 00:05:47.871 --rc genhtml_function_coverage=1 00:05:47.871 --rc genhtml_legend=1 00:05:47.871 --rc geninfo_all_blocks=1 00:05:47.871 --rc geninfo_unexecuted_blocks=1 00:05:47.871 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.871 ' 00:05:47.871 05:30:59 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:47.871 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.871 --rc genhtml_branch_coverage=1 00:05:47.871 --rc genhtml_function_coverage=1 00:05:47.871 --rc genhtml_legend=1 00:05:47.871 --rc geninfo_all_blocks=1 00:05:47.871 --rc geninfo_unexecuted_blocks=1 00:05:47.871 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.871 ' 00:05:47.871 05:30:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:47.871 05:30:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2192237 00:05:47.871 05:30:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2192237 00:05:47.871 05:30:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:47.871 05:30:59 -- common/autotest_common.sh@829 -- # '[' -z 2192237 ']' 00:05:47.871 05:30:59 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:47.871 05:30:59 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:47.871 05:30:59 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:47.871 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:47.871 05:30:59 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:47.871 05:30:59 -- common/autotest_common.sh@10 -- # set +x 00:05:47.871 [2024-11-29 05:30:59.136720] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:05:47.871 [2024-11-29 05:30:59.136795] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2192237 ] 00:05:47.871 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.130 [2024-11-29 05:30:59.203490] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.130 [2024-11-29 05:30:59.239422] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:48.130 [2024-11-29 05:30:59.239561] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.698 05:30:59 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:48.698 05:30:59 -- common/autotest_common.sh@862 -- # return 0 00:05:48.698 05:30:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:48.698 05:30:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:48.698 05:30:59 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:48.698 05:30:59 -- common/autotest_common.sh@10 -- # set +x 00:05:48.698 { 00:05:48.698 "filename": "/tmp/spdk_mem_dump.txt" 00:05:48.698 } 00:05:48.698 05:30:59 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:48.698 05:30:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:48.958 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:48.958 1 heaps totaling size 814.000000 MiB 00:05:48.958 size: 814.000000 MiB heap id: 0 00:05:48.958 end heaps---------- 00:05:48.958 8 mempools totaling size 598.116089 MiB 00:05:48.958 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:48.958 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:48.958 size: 84.521057 MiB name: bdev_io_2192237 00:05:48.958 size: 51.011292 MiB name: evtpool_2192237 00:05:48.958 size: 50.003479 MiB name: msgpool_2192237 00:05:48.958 size: 21.763794 MiB name: PDU_Pool 00:05:48.958 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:48.958 size: 0.026123 MiB name: Session_Pool 00:05:48.958 end mempools------- 00:05:48.958 6 memzones totaling size 4.142822 MiB 00:05:48.958 size: 1.000366 MiB name: RG_ring_0_2192237 00:05:48.958 size: 1.000366 MiB name: RG_ring_1_2192237 00:05:48.958 size: 1.000366 MiB name: RG_ring_4_2192237 00:05:48.958 size: 1.000366 MiB name: RG_ring_5_2192237 00:05:48.958 size: 0.125366 MiB name: RG_ring_2_2192237 00:05:48.958 size: 0.015991 MiB name: RG_ring_3_2192237 00:05:48.958 end memzones------- 00:05:48.958 05:31:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:48.958 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:48.958 list of free elements. 
size: 12.519348 MiB 00:05:48.958 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:48.958 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:48.958 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:48.958 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:48.958 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:48.958 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:48.958 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:48.958 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:48.958 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:48.958 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:48.958 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:48.958 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:48.958 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:48.958 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:48.958 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:48.958 list of standard malloc elements. size: 199.218079 MiB 00:05:48.958 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:48.958 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:48.958 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:48.958 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:48.958 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:48.958 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:48.958 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:48.958 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:48.958 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:48.958 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:48.958 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:48.958 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:48.958 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:48.958 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:48.958 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:48.958 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:48.958 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:48.958 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:48.958 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:48.958 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:48.958 list of memzone associated elements. size: 602.262573 MiB 00:05:48.958 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:48.958 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:48.958 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:48.958 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:48.958 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:48.958 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2192237_0 00:05:48.958 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:48.958 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2192237_0 00:05:48.958 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:48.958 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2192237_0 00:05:48.958 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:48.958 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:48.958 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:48.958 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:48.958 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:48.958 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2192237 00:05:48.958 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:48.958 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2192237 00:05:48.958 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:48.958 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2192237 00:05:48.958 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:48.958 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:48.958 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:48.958 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:48.958 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:48.958 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:48.959 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:48.959 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:48.959 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:48.959 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2192237 00:05:48.959 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:48.959 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2192237 00:05:48.959 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:48.959 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2192237 00:05:48.959 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:48.959 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2192237 00:05:48.959 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:48.959 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2192237 00:05:48.959 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:48.959 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:48.959 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:48.959 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:48.959 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:48.959 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:48.959 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:48.959 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2192237 00:05:48.959 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:48.959 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:48.959 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:48.959 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:48.959 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:48.959 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2192237 00:05:48.959 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:48.959 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:48.959 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:48.959 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2192237 00:05:48.959 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:48.959 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2192237 00:05:48.959 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:48.959 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:48.959 05:31:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:48.959 05:31:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2192237 00:05:48.959 05:31:00 -- common/autotest_common.sh@936 -- # '[' -z 2192237 ']' 00:05:48.959 05:31:00 -- common/autotest_common.sh@940 -- # kill -0 2192237 00:05:48.959 05:31:00 -- common/autotest_common.sh@941 -- # uname 00:05:48.959 05:31:00 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:48.959 05:31:00 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2192237 00:05:48.959 05:31:00 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:48.959 05:31:00 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:48.959 05:31:00 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2192237' 00:05:48.959 killing process with pid 2192237 00:05:48.959 05:31:00 -- common/autotest_common.sh@955 -- # kill 2192237 00:05:48.959 05:31:00 -- common/autotest_common.sh@960 -- # wait 2192237 00:05:49.218 00:05:49.218 real 0m1.497s 00:05:49.218 user 0m1.518s 00:05:49.218 sys 0m0.476s 00:05:49.218 05:31:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:49.218 05:31:00 -- common/autotest_common.sh@10 -- # set +x 00:05:49.218 ************************************ 00:05:49.218 END TEST dpdk_mem_utility 00:05:49.218 ************************************ 00:05:49.218 05:31:00 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:49.218 05:31:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:49.218 05:31:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:49.218 05:31:00 -- common/autotest_common.sh@10 -- # set +x 
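The START/END banners and the real/user/sys lines throughout this log come from a run_test wrapper that times each test script. A hedged reconstruction consistent with the output above (the actual wrapper in common/autotest_common.sh also feeds failures into the ERR trap rather than just printing):

run_test() {
    local name=$1; shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"                                 # produces the real/user/sys lines
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
}
run_test dpdk_mem_utility test/dpdk_memory_utility/test_dpdk_mem_info.sh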
00:05:49.218 ************************************ 00:05:49.218 START TEST event 00:05:49.218 ************************************ 00:05:49.218 05:31:00 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:49.478 * Looking for test storage... 00:05:49.478 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:49.478 05:31:00 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:49.478 05:31:00 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:49.478 05:31:00 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:49.478 05:31:00 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:49.478 05:31:00 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:49.478 05:31:00 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:49.478 05:31:00 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:49.478 05:31:00 -- scripts/common.sh@335 -- # IFS=.-: 00:05:49.478 05:31:00 -- scripts/common.sh@335 -- # read -ra ver1 00:05:49.478 05:31:00 -- scripts/common.sh@336 -- # IFS=.-: 00:05:49.478 05:31:00 -- scripts/common.sh@336 -- # read -ra ver2 00:05:49.478 05:31:00 -- scripts/common.sh@337 -- # local 'op=<' 00:05:49.478 05:31:00 -- scripts/common.sh@339 -- # ver1_l=2 00:05:49.478 05:31:00 -- scripts/common.sh@340 -- # ver2_l=1 00:05:49.478 05:31:00 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:49.478 05:31:00 -- scripts/common.sh@343 -- # case "$op" in 00:05:49.478 05:31:00 -- scripts/common.sh@344 -- # : 1 00:05:49.478 05:31:00 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:49.478 05:31:00 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:49.478 05:31:00 -- scripts/common.sh@364 -- # decimal 1 00:05:49.478 05:31:00 -- scripts/common.sh@352 -- # local d=1 00:05:49.478 05:31:00 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:49.478 05:31:00 -- scripts/common.sh@354 -- # echo 1 00:05:49.478 05:31:00 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:49.478 05:31:00 -- scripts/common.sh@365 -- # decimal 2 00:05:49.478 05:31:00 -- scripts/common.sh@352 -- # local d=2 00:05:49.478 05:31:00 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:49.478 05:31:00 -- scripts/common.sh@354 -- # echo 2 00:05:49.478 05:31:00 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:49.478 05:31:00 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:49.478 05:31:00 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:49.478 05:31:00 -- scripts/common.sh@367 -- # return 0 00:05:49.478 05:31:00 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:49.478 05:31:00 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:49.478 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:49.478 --rc genhtml_branch_coverage=1 00:05:49.478 --rc genhtml_function_coverage=1 00:05:49.478 --rc genhtml_legend=1 00:05:49.478 --rc geninfo_all_blocks=1 00:05:49.478 --rc geninfo_unexecuted_blocks=1 00:05:49.478 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:49.478 ' 00:05:49.478 05:31:00 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:49.478 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:49.478 --rc genhtml_branch_coverage=1 00:05:49.478 --rc genhtml_function_coverage=1 00:05:49.478 --rc genhtml_legend=1 00:05:49.478 --rc geninfo_all_blocks=1 00:05:49.478 --rc geninfo_unexecuted_blocks=1 00:05:49.478 
--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:49.478 ' 00:05:49.478 05:31:00 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:49.478 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:49.478 --rc genhtml_branch_coverage=1 00:05:49.478 --rc genhtml_function_coverage=1 00:05:49.478 --rc genhtml_legend=1 00:05:49.478 --rc geninfo_all_blocks=1 00:05:49.478 --rc geninfo_unexecuted_blocks=1 00:05:49.478 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:49.478 ' 00:05:49.478 05:31:00 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:49.478 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:49.478 --rc genhtml_branch_coverage=1 00:05:49.478 --rc genhtml_function_coverage=1 00:05:49.478 --rc genhtml_legend=1 00:05:49.478 --rc geninfo_all_blocks=1 00:05:49.478 --rc geninfo_unexecuted_blocks=1 00:05:49.478 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:49.478 ' 00:05:49.478 05:31:00 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:49.478 05:31:00 -- bdev/nbd_common.sh@6 -- # set -e 00:05:49.478 05:31:00 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:49.478 05:31:00 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:05:49.478 05:31:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:49.478 05:31:00 -- common/autotest_common.sh@10 -- # set +x 00:05:49.478 ************************************ 00:05:49.478 START TEST event_perf 00:05:49.478 ************************************ 00:05:49.478 05:31:00 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:49.478 Running I/O for 1 seconds...[2024-11-29 05:31:00.691061] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:49.478 [2024-11-29 05:31:00.691193] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2192583 ] 00:05:49.478 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.478 [2024-11-29 05:31:00.763035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:49.737 [2024-11-29 05:31:00.803370] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:49.737 [2024-11-29 05:31:00.803463] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:49.737 [2024-11-29 05:31:00.803566] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:49.737 [2024-11-29 05:31:00.803567] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.672 Running I/O for 1 seconds... 00:05:50.672 lcore 0: 192492 00:05:50.672 lcore 1: 192491 00:05:50.672 lcore 2: 192491 00:05:50.672 lcore 3: 192493 00:05:50.672 done. 
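event_perf above was launched with -m 0xF -t 1: a four-core mask and a one-second run, after which each reactor reports its event count per lcore. The mask is simply a bitmap of lcore ids; a small sketch of how such a mask is built and how the run from the trace would be reproduced (binary path as shown in the trace, relative to the spdk checkout):

# Build a core mask for lcores 0-3 (-> 0xF, matching the run above)
mask=0
for core in 0 1 2 3; do
    mask=$(( mask | (1 << core) ))
done
printf 'core mask: 0x%X\n' "$mask"

# One-second run on those cores; prints "lcore N: <events>" per reactor when done
sudo ./test/event/event_perf/event_perf -m 0xF -t 1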
00:05:50.672 00:05:50.672 real 0m1.188s 00:05:50.672 user 0m4.085s 00:05:50.672 sys 0m0.099s 00:05:50.672 05:31:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:50.672 05:31:01 -- common/autotest_common.sh@10 -- # set +x 00:05:50.672 ************************************ 00:05:50.672 END TEST event_perf 00:05:50.672 ************************************ 00:05:50.672 05:31:01 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:50.672 05:31:01 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:50.672 05:31:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.672 05:31:01 -- common/autotest_common.sh@10 -- # set +x 00:05:50.672 ************************************ 00:05:50.672 START TEST event_reactor 00:05:50.672 ************************************ 00:05:50.672 05:31:01 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:50.672 [2024-11-29 05:31:01.928810] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:50.672 [2024-11-29 05:31:01.928901] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2192874 ] 00:05:50.672 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.931 [2024-11-29 05:31:01.999067] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.931 [2024-11-29 05:31:02.035146] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.868 test_start 00:05:51.868 oneshot 00:05:51.868 tick 100 00:05:51.868 tick 100 00:05:51.868 tick 250 00:05:51.868 tick 100 00:05:51.868 tick 100 00:05:51.868 tick 100 00:05:51.868 tick 250 00:05:51.868 tick 500 00:05:51.868 tick 100 00:05:51.868 tick 100 00:05:51.868 tick 250 00:05:51.868 tick 100 00:05:51.868 tick 100 00:05:51.868 test_end 00:05:51.868 00:05:51.868 real 0m1.179s 00:05:51.868 user 0m1.085s 00:05:51.868 sys 0m0.089s 00:05:51.868 05:31:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:51.868 05:31:03 -- common/autotest_common.sh@10 -- # set +x 00:05:51.868 ************************************ 00:05:51.868 END TEST event_reactor 00:05:51.868 ************************************ 00:05:51.868 05:31:03 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:51.868 05:31:03 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:51.868 05:31:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.868 05:31:03 -- common/autotest_common.sh@10 -- # set +x 00:05:51.868 ************************************ 00:05:51.868 START TEST event_reactor_perf 00:05:51.868 ************************************ 00:05:51.868 05:31:03 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:51.868 [2024-11-29 05:31:03.156567] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:05:51.868 [2024-11-29 05:31:03.156671] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2193158 ] 00:05:52.127 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.127 [2024-11-29 05:31:03.224664] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.127 [2024-11-29 05:31:03.260743] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.062 test_start 00:05:53.062 test_end 00:05:53.062 Performance: 981199 events per second 00:05:53.062 00:05:53.062 real 0m1.176s 00:05:53.062 user 0m1.088s 00:05:53.062 sys 0m0.084s 00:05:53.062 05:31:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:53.062 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.062 ************************************ 00:05:53.062 END TEST event_reactor_perf 00:05:53.062 ************************************ 00:05:53.062 05:31:04 -- event/event.sh@49 -- # uname -s 00:05:53.322 05:31:04 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:53.322 05:31:04 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:53.322 05:31:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:53.322 05:31:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:53.322 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.322 ************************************ 00:05:53.322 START TEST event_scheduler 00:05:53.322 ************************************ 00:05:53.322 05:31:04 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:53.322 * Looking for test storage... 00:05:53.322 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:53.322 05:31:04 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:53.322 05:31:04 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:53.322 05:31:04 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:53.322 05:31:04 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:53.322 05:31:04 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:53.322 05:31:04 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:53.322 05:31:04 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:53.322 05:31:04 -- scripts/common.sh@335 -- # IFS=.-: 00:05:53.322 05:31:04 -- scripts/common.sh@335 -- # read -ra ver1 00:05:53.322 05:31:04 -- scripts/common.sh@336 -- # IFS=.-: 00:05:53.322 05:31:04 -- scripts/common.sh@336 -- # read -ra ver2 00:05:53.322 05:31:04 -- scripts/common.sh@337 -- # local 'op=<' 00:05:53.322 05:31:04 -- scripts/common.sh@339 -- # ver1_l=2 00:05:53.322 05:31:04 -- scripts/common.sh@340 -- # ver2_l=1 00:05:53.322 05:31:04 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:53.322 05:31:04 -- scripts/common.sh@343 -- # case "$op" in 00:05:53.322 05:31:04 -- scripts/common.sh@344 -- # : 1 00:05:53.322 05:31:04 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:53.322 05:31:04 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:53.322 05:31:04 -- scripts/common.sh@364 -- # decimal 1 00:05:53.322 05:31:04 -- scripts/common.sh@352 -- # local d=1 00:05:53.322 05:31:04 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:53.322 05:31:04 -- scripts/common.sh@354 -- # echo 1 00:05:53.322 05:31:04 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:53.322 05:31:04 -- scripts/common.sh@365 -- # decimal 2 00:05:53.322 05:31:04 -- scripts/common.sh@352 -- # local d=2 00:05:53.322 05:31:04 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:53.322 05:31:04 -- scripts/common.sh@354 -- # echo 2 00:05:53.322 05:31:04 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:53.322 05:31:04 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:53.322 05:31:04 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:53.322 05:31:04 -- scripts/common.sh@367 -- # return 0 00:05:53.322 05:31:04 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:53.322 05:31:04 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:53.322 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:53.322 --rc genhtml_branch_coverage=1 00:05:53.322 --rc genhtml_function_coverage=1 00:05:53.322 --rc genhtml_legend=1 00:05:53.322 --rc geninfo_all_blocks=1 00:05:53.322 --rc geninfo_unexecuted_blocks=1 00:05:53.322 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:53.322 ' 00:05:53.322 05:31:04 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:53.322 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:53.322 --rc genhtml_branch_coverage=1 00:05:53.322 --rc genhtml_function_coverage=1 00:05:53.322 --rc genhtml_legend=1 00:05:53.322 --rc geninfo_all_blocks=1 00:05:53.322 --rc geninfo_unexecuted_blocks=1 00:05:53.322 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:53.322 ' 00:05:53.322 05:31:04 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:53.322 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:53.322 --rc genhtml_branch_coverage=1 00:05:53.323 --rc genhtml_function_coverage=1 00:05:53.323 --rc genhtml_legend=1 00:05:53.323 --rc geninfo_all_blocks=1 00:05:53.323 --rc geninfo_unexecuted_blocks=1 00:05:53.323 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:53.323 ' 00:05:53.323 05:31:04 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:53.323 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:53.323 --rc genhtml_branch_coverage=1 00:05:53.323 --rc genhtml_function_coverage=1 00:05:53.323 --rc genhtml_legend=1 00:05:53.323 --rc geninfo_all_blocks=1 00:05:53.323 --rc geninfo_unexecuted_blocks=1 00:05:53.323 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:53.323 ' 00:05:53.323 05:31:04 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:53.323 05:31:04 -- scheduler/scheduler.sh@35 -- # scheduler_pid=2193476 00:05:53.323 05:31:04 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:53.323 05:31:04 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:53.323 05:31:04 -- scheduler/scheduler.sh@37 -- # waitforlisten 2193476 00:05:53.323 05:31:04 -- common/autotest_common.sh@829 -- # '[' -z 2193476 ']' 00:05:53.323 05:31:04 -- common/autotest_common.sh@833 
-- # local rpc_addr=/var/tmp/spdk.sock 00:05:53.323 05:31:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:53.323 05:31:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:53.323 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:53.323 05:31:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:53.323 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.323 [2024-11-29 05:31:04.585141] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:53.323 [2024-11-29 05:31:04.585220] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2193476 ] 00:05:53.323 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.581 [2024-11-29 05:31:04.649709] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:53.581 [2024-11-29 05:31:04.690067] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.581 [2024-11-29 05:31:04.690152] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:53.581 [2024-11-29 05:31:04.690256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:53.581 [2024-11-29 05:31:04.690258] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:53.581 05:31:04 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:53.581 05:31:04 -- common/autotest_common.sh@862 -- # return 0 00:05:53.581 05:31:04 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:53.581 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.581 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.581 POWER: Env isn't set yet! 00:05:53.581 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:53.581 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:53.581 POWER: Cannot set governor of lcore 0 to userspace 00:05:53.581 POWER: Attempting to initialise PSTAT power management... 
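The POWER lines here are DPDK's power-management init: the acpi-cpufreq attempt fails because the governor of lcore 0 cannot be switched to userspace, so EAL falls back to the pstat path and pins each lcore's governor to performance instead. A hedged sketch for inspecting the same sysfs knobs the log names (on intel_pstate systems only performance and powersave are normally exposed, which is consistent with the userspace write failing above):

for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
    [ -d "$cpu/cpufreq" ] || continue
    printf '%s: driver=%s governor=%s\n' "$cpu" \
        "$(cat "$cpu/cpufreq/scaling_driver")" \
        "$(cat "$cpu/cpufreq/scaling_governor")"
done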
00:05:53.581 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:53.581 POWER: Initialized successfully for lcore 0 power management 00:05:53.581 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:53.582 POWER: Initialized successfully for lcore 1 power management 00:05:53.582 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:53.582 POWER: Initialized successfully for lcore 2 power management 00:05:53.582 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:53.582 POWER: Initialized successfully for lcore 3 power management 00:05:53.582 [2024-11-29 05:31:04.789768] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:53.582 [2024-11-29 05:31:04.789783] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:53.582 [2024-11-29 05:31:04.789794] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:53.582 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.582 05:31:04 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:53.582 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.582 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.582 [2024-11-29 05:31:04.852027] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:05:53.582 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.582 05:31:04 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:53.582 05:31:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:53.582 05:31:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:53.582 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.582 ************************************ 00:05:53.582 START TEST scheduler_create_thread 00:05:53.582 ************************************ 00:05:53.582 05:31:04 -- common/autotest_common.sh@1114 -- # scheduler_create_thread 00:05:53.582 05:31:04 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:53.582 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.582 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.582 2 00:05:53.582 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.582 05:31:04 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:53.582 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.582 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.840 3 00:05:53.840 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.840 05:31:04 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:53.840 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.840 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.840 4 00:05:53.840 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.840 05:31:04 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:53.840 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.840 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.840 5 00:05:53.840 
05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.840 05:31:04 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:53.840 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.840 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.840 6 00:05:53.840 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.840 05:31:04 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:53.840 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.840 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.840 7 00:05:53.840 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.840 05:31:04 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:53.840 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.840 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.840 8 00:05:53.840 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.840 05:31:04 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:53.840 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.840 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.840 9 00:05:53.840 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.841 05:31:04 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:53.841 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.841 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.841 10 00:05:53.841 05:31:04 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.841 05:31:04 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:53.841 05:31:04 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.841 05:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:54.774 05:31:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:54.774 05:31:05 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:54.774 05:31:05 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:54.774 05:31:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:54.774 05:31:05 -- common/autotest_common.sh@10 -- # set +x 00:05:55.709 05:31:06 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:55.709 05:31:06 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:55.709 05:31:06 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:55.709 05:31:06 -- common/autotest_common.sh@10 -- # set +x 00:05:56.644 05:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.644 05:31:07 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:56.644 05:31:07 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:56.644 05:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.644 05:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:57.210 05:31:08 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.210 00:05:57.210 real 0m3.558s 00:05:57.210 user 0m0.026s 00:05:57.210 sys 0m0.006s 00:05:57.210 05:31:08 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:05:57.210 05:31:08 -- common/autotest_common.sh@10 -- # set +x 00:05:57.210 ************************************ 00:05:57.210 END TEST scheduler_create_thread 00:05:57.210 ************************************ 00:05:57.211 05:31:08 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:57.211 05:31:08 -- scheduler/scheduler.sh@46 -- # killprocess 2193476 00:05:57.211 05:31:08 -- common/autotest_common.sh@936 -- # '[' -z 2193476 ']' 00:05:57.211 05:31:08 -- common/autotest_common.sh@940 -- # kill -0 2193476 00:05:57.211 05:31:08 -- common/autotest_common.sh@941 -- # uname 00:05:57.211 05:31:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:57.211 05:31:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2193476 00:05:57.469 05:31:08 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:05:57.469 05:31:08 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:05:57.469 05:31:08 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2193476' 00:05:57.469 killing process with pid 2193476 00:05:57.469 05:31:08 -- common/autotest_common.sh@955 -- # kill 2193476 00:05:57.469 05:31:08 -- common/autotest_common.sh@960 -- # wait 2193476 00:05:57.728 [2024-11-29 05:31:08.800225] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:05:57.728 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:57.728 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:57.728 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:57.728 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:57.728 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:57.728 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:57.728 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:57.728 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:57.987 00:05:57.987 real 0m4.662s 00:05:57.987 user 0m8.440s 00:05:57.987 sys 0m0.405s 00:05:57.987 05:31:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:57.987 05:31:09 -- common/autotest_common.sh@10 -- # set +x 00:05:57.987 ************************************ 00:05:57.987 END TEST event_scheduler 00:05:57.987 ************************************ 00:05:57.987 05:31:09 -- event/event.sh@51 -- # modprobe -n nbd 00:05:57.987 05:31:09 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:57.987 05:31:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:57.987 05:31:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:57.987 05:31:09 -- common/autotest_common.sh@10 -- # set +x 00:05:57.987 ************************************ 00:05:57.987 START TEST app_repeat 00:05:57.987 ************************************ 00:05:57.987 05:31:09 -- common/autotest_common.sh@1114 -- # app_repeat_test 00:05:57.987 05:31:09 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:57.987 05:31:09 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:57.987 05:31:09 -- event/event.sh@13 -- # local nbd_list 00:05:57.987 05:31:09 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:57.987 05:31:09 -- 
event/event.sh@14 -- # local bdev_list 00:05:57.987 05:31:09 -- event/event.sh@15 -- # local repeat_times=4 00:05:57.987 05:31:09 -- event/event.sh@17 -- # modprobe nbd 00:05:57.987 05:31:09 -- event/event.sh@19 -- # repeat_pid=2194336 00:05:57.987 05:31:09 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:57.987 05:31:09 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:57.987 05:31:09 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2194336' 00:05:57.987 Process app_repeat pid: 2194336 00:05:57.987 05:31:09 -- event/event.sh@23 -- # for i in {0..2} 00:05:57.987 05:31:09 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:57.987 spdk_app_start Round 0 00:05:57.987 05:31:09 -- event/event.sh@25 -- # waitforlisten 2194336 /var/tmp/spdk-nbd.sock 00:05:57.987 05:31:09 -- common/autotest_common.sh@829 -- # '[' -z 2194336 ']' 00:05:57.987 05:31:09 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:57.987 05:31:09 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:57.987 05:31:09 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:57.987 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:57.987 05:31:09 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:57.987 05:31:09 -- common/autotest_common.sh@10 -- # set +x 00:05:57.987 [2024-11-29 05:31:09.122284] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:57.987 [2024-11-29 05:31:09.122373] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2194336 ] 00:05:57.987 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.987 [2024-11-29 05:31:09.189832] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:57.987 [2024-11-29 05:31:09.228251] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:57.988 [2024-11-29 05:31:09.228253] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.953 05:31:09 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:58.953 05:31:09 -- common/autotest_common.sh@862 -- # return 0 00:05:58.953 05:31:09 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:58.953 Malloc0 00:05:58.953 05:31:10 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:59.210 Malloc1 00:05:59.210 05:31:10 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:59.210 
05:31:10 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@12 -- # local i 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:59.210 /dev/nbd0 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:59.210 05:31:10 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:59.210 05:31:10 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:59.210 05:31:10 -- common/autotest_common.sh@867 -- # local i 00:05:59.210 05:31:10 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:59.210 05:31:10 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:59.210 05:31:10 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:59.467 05:31:10 -- common/autotest_common.sh@871 -- # break 00:05:59.467 05:31:10 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:59.467 05:31:10 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:59.467 05:31:10 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:59.467 1+0 records in 00:05:59.467 1+0 records out 00:05:59.467 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224472 s, 18.2 MB/s 00:05:59.467 05:31:10 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.467 05:31:10 -- common/autotest_common.sh@884 -- # size=4096 00:05:59.467 05:31:10 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.467 05:31:10 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:59.467 05:31:10 -- common/autotest_common.sh@887 -- # return 0 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:59.467 /dev/nbd1 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:59.467 05:31:10 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:59.467 05:31:10 -- common/autotest_common.sh@867 -- # local i 00:05:59.467 05:31:10 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:59.467 05:31:10 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:59.467 05:31:10 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:59.467 05:31:10 -- common/autotest_common.sh@871 -- # break 00:05:59.467 05:31:10 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:59.467 05:31:10 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:59.467 05:31:10 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 
iflag=direct 00:05:59.467 1+0 records in 00:05:59.467 1+0 records out 00:05:59.467 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255363 s, 16.0 MB/s 00:05:59.467 05:31:10 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.467 05:31:10 -- common/autotest_common.sh@884 -- # size=4096 00:05:59.467 05:31:10 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:59.467 05:31:10 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:59.467 05:31:10 -- common/autotest_common.sh@887 -- # return 0 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.467 05:31:10 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:59.725 { 00:05:59.725 "nbd_device": "/dev/nbd0", 00:05:59.725 "bdev_name": "Malloc0" 00:05:59.725 }, 00:05:59.725 { 00:05:59.725 "nbd_device": "/dev/nbd1", 00:05:59.725 "bdev_name": "Malloc1" 00:05:59.725 } 00:05:59.725 ]' 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:59.725 { 00:05:59.725 "nbd_device": "/dev/nbd0", 00:05:59.725 "bdev_name": "Malloc0" 00:05:59.725 }, 00:05:59.725 { 00:05:59.725 "nbd_device": "/dev/nbd1", 00:05:59.725 "bdev_name": "Malloc1" 00:05:59.725 } 00:05:59.725 ]' 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:59.725 /dev/nbd1' 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:59.725 /dev/nbd1' 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@65 -- # count=2 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@95 -- # count=2 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:59.725 256+0 records in 00:05:59.725 256+0 records out 00:05:59.725 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108835 s, 96.3 MB/s 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:59.725 05:31:10 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:59.725 256+0 records in 00:05:59.725 256+0 records out 00:05:59.725 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0192477 s, 54.5 MB/s 00:05:59.725 05:31:11 -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:59.725 05:31:11 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:59.983 256+0 records in 00:05:59.983 256+0 records out 00:05:59.983 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.020841 s, 50.3 MB/s 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@51 -- # local i 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@41 -- # break 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@45 -- # return 0 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:59.983 05:31:11 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@41 -- # break 00:06:00.266 05:31:11 -- 
bdev/nbd_common.sh@45 -- # return 0 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:00.266 05:31:11 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@65 -- # true 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@65 -- # count=0 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@104 -- # count=0 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:00.524 05:31:11 -- bdev/nbd_common.sh@109 -- # return 0 00:06:00.524 05:31:11 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:00.781 05:31:11 -- event/event.sh@35 -- # sleep 3 00:06:00.781 [2024-11-29 05:31:12.046529] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:00.781 [2024-11-29 05:31:12.079409] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:00.781 [2024-11-29 05:31:12.079411] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.038 [2024-11-29 05:31:12.119140] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:01.038 [2024-11-29 05:31:12.119188] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:04.318 05:31:14 -- event/event.sh@23 -- # for i in {0..2} 00:06:04.318 05:31:14 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:04.318 spdk_app_start Round 1 00:06:04.318 05:31:14 -- event/event.sh@25 -- # waitforlisten 2194336 /var/tmp/spdk-nbd.sock 00:06:04.318 05:31:14 -- common/autotest_common.sh@829 -- # '[' -z 2194336 ']' 00:06:04.318 05:31:14 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:04.318 05:31:14 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:04.318 05:31:14 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:04.318 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
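Round 0 above built its block stack over the app's RPC socket before the nbd checks ran: two 64 MiB malloc bdevs with a 4096-byte block size, each exported as a kernel nbd device, then nbd_get_disks to confirm the device-to-bdev pairing. The same calls can be issued by hand; the commands below are exactly the RPCs visible in the trace, with paths shortened to the spdk checkout:

rpc() { ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock "$@"; }

rpc bdev_malloc_create 64 4096          # -> Malloc0 (64 MiB, 4096-byte blocks)
rpc bdev_malloc_create 64 4096          # -> Malloc1
rpc nbd_start_disk Malloc0 /dev/nbd0    # export each bdev as an nbd device
rpc nbd_start_disk Malloc1 /dev/nbd1
rpc nbd_get_disks                       # JSON list pairing nbd_device <-> bdev_name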
00:06:04.318 05:31:14 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:04.318 05:31:14 -- common/autotest_common.sh@10 -- # set +x 00:06:04.318 05:31:15 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:04.318 05:31:15 -- common/autotest_common.sh@862 -- # return 0 00:06:04.318 05:31:15 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:04.318 Malloc0 00:06:04.318 05:31:15 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:04.318 Malloc1 00:06:04.318 05:31:15 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@12 -- # local i 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:04.318 05:31:15 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:04.577 /dev/nbd0 00:06:04.577 05:31:15 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:04.577 05:31:15 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:04.577 05:31:15 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:04.577 05:31:15 -- common/autotest_common.sh@867 -- # local i 00:06:04.577 05:31:15 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:04.577 05:31:15 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:04.577 05:31:15 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:04.577 05:31:15 -- common/autotest_common.sh@871 -- # break 00:06:04.577 05:31:15 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:04.577 05:31:15 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:04.577 05:31:15 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:04.577 1+0 records in 00:06:04.577 1+0 records out 00:06:04.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238807 s, 17.2 MB/s 00:06:04.577 05:31:15 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.577 05:31:15 -- common/autotest_common.sh@884 -- # size=4096 00:06:04.577 05:31:15 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.577 05:31:15 -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:04.577 05:31:15 -- common/autotest_common.sh@887 -- # return 0 00:06:04.577 05:31:15 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:04.577 05:31:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:04.577 05:31:15 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:04.577 /dev/nbd1 00:06:04.577 05:31:15 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:04.577 05:31:15 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:04.577 05:31:15 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:04.577 05:31:15 -- common/autotest_common.sh@867 -- # local i 00:06:04.577 05:31:15 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:04.577 05:31:15 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:04.577 05:31:15 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:04.577 05:31:15 -- common/autotest_common.sh@871 -- # break 00:06:04.577 05:31:15 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:04.577 05:31:15 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:04.577 05:31:15 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:04.577 1+0 records in 00:06:04.577 1+0 records out 00:06:04.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242148 s, 16.9 MB/s 00:06:04.577 05:31:15 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.836 05:31:15 -- common/autotest_common.sh@884 -- # size=4096 00:06:04.836 05:31:15 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.836 05:31:15 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:04.836 05:31:15 -- common/autotest_common.sh@887 -- # return 0 00:06:04.836 05:31:15 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:04.836 05:31:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:04.836 05:31:15 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:04.836 05:31:15 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.836 05:31:15 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:04.836 { 00:06:04.836 "nbd_device": "/dev/nbd0", 00:06:04.836 "bdev_name": "Malloc0" 00:06:04.836 }, 00:06:04.836 { 00:06:04.836 "nbd_device": "/dev/nbd1", 00:06:04.836 "bdev_name": "Malloc1" 00:06:04.836 } 00:06:04.836 ]' 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:04.836 { 00:06:04.836 "nbd_device": "/dev/nbd0", 00:06:04.836 "bdev_name": "Malloc0" 00:06:04.836 }, 00:06:04.836 { 00:06:04.836 "nbd_device": "/dev/nbd1", 00:06:04.836 "bdev_name": "Malloc1" 00:06:04.836 } 00:06:04.836 ]' 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:04.836 /dev/nbd1' 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:04.836 /dev/nbd1' 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@65 -- # count=2 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:04.836 05:31:16 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:04.836 256+0 records in 00:06:04.836 256+0 records out 00:06:04.836 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00446577 s, 235 MB/s 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:04.836 05:31:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:05.095 256+0 records in 00:06:05.095 256+0 records out 00:06:05.095 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197711 s, 53.0 MB/s 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:05.095 256+0 records in 00:06:05.095 256+0 records out 00:06:05.095 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0208833 s, 50.2 MB/s 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@51 -- # local i 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@41 -- # break 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@45 -- # return 0 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:05.095 05:31:16 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:05.353 05:31:16 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:05.353 05:31:16 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:05.354 05:31:16 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:05.354 05:31:16 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:05.354 05:31:16 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:05.354 05:31:16 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:05.354 05:31:16 -- bdev/nbd_common.sh@41 -- # break 00:06:05.354 05:31:16 -- bdev/nbd_common.sh@45 -- # return 0 00:06:05.354 05:31:16 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:05.354 05:31:16 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:05.354 05:31:16 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@65 -- # true 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@65 -- # count=0 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@104 -- # count=0 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:05.612 05:31:16 -- bdev/nbd_common.sh@109 -- # return 0 00:06:05.612 05:31:16 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:05.872 05:31:17 -- event/event.sh@35 -- # sleep 3 00:06:06.131 [2024-11-29 05:31:17.182261] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:06.131 [2024-11-29 05:31:17.215226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:06.131 [2024-11-29 05:31:17.215228] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.131 [2024-11-29 05:31:17.254823] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:06.131 [2024-11-29 05:31:17.254870] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
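Each round's nbd_dd_data_verify pass, traced above, follows the same write/verify shape: seed a 1 MiB random file, dd it onto every exported nbd device with oflag=direct, then cmp each device against the file before removing it. A condensed sketch of that loop, using the same dd and cmp invocations the trace shows:

tmp=test/event/nbdrandtest
dd if=/dev/urandom of="$tmp" bs=4096 count=256            # 1 MiB of random data
for nbd in /dev/nbd0 /dev/nbd1; do
    dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct # write through the nbd device
done
for nbd in /dev/nbd0 /dev/nbd1; do
    cmp -b -n 1M "$tmp" "$nbd"                            # byte-compare the readback
done
rm "$tmp"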
00:06:09.417 05:31:20 -- event/event.sh@23 -- # for i in {0..2} 00:06:09.417 05:31:20 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:09.417 spdk_app_start Round 2 00:06:09.417 05:31:20 -- event/event.sh@25 -- # waitforlisten 2194336 /var/tmp/spdk-nbd.sock 00:06:09.417 05:31:20 -- common/autotest_common.sh@829 -- # '[' -z 2194336 ']' 00:06:09.417 05:31:20 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:09.417 05:31:20 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:09.417 05:31:20 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:09.417 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:09.417 05:31:20 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:09.417 05:31:20 -- common/autotest_common.sh@10 -- # set +x 00:06:09.417 05:31:20 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:09.417 05:31:20 -- common/autotest_common.sh@862 -- # return 0 00:06:09.417 05:31:20 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:09.417 Malloc0 00:06:09.417 05:31:20 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:09.417 Malloc1 00:06:09.417 05:31:20 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@12 -- # local i 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:09.417 05:31:20 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:09.676 /dev/nbd0 00:06:09.676 05:31:20 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:09.676 05:31:20 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:09.676 05:31:20 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:09.676 05:31:20 -- common/autotest_common.sh@867 -- # local i 00:06:09.677 05:31:20 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:09.677 05:31:20 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:09.677 05:31:20 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:09.677 05:31:20 -- common/autotest_common.sh@871 -- # break 00:06:09.677 05:31:20 -- common/autotest_common.sh@882 -- # (( 
i = 1 )) 00:06:09.677 05:31:20 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:09.677 05:31:20 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:09.677 1+0 records in 00:06:09.677 1+0 records out 00:06:09.677 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223816 s, 18.3 MB/s 00:06:09.677 05:31:20 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.677 05:31:20 -- common/autotest_common.sh@884 -- # size=4096 00:06:09.677 05:31:20 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.677 05:31:20 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:09.677 05:31:20 -- common/autotest_common.sh@887 -- # return 0 00:06:09.677 05:31:20 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:09.677 05:31:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:09.677 05:31:20 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:09.677 /dev/nbd1 00:06:09.677 05:31:20 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:09.677 05:31:20 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:09.677 05:31:20 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:09.677 05:31:20 -- common/autotest_common.sh@867 -- # local i 00:06:09.677 05:31:20 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:09.677 05:31:20 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:09.677 05:31:20 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:09.677 05:31:20 -- common/autotest_common.sh@871 -- # break 00:06:09.677 05:31:20 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:09.677 05:31:20 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:09.677 05:31:20 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:09.677 1+0 records in 00:06:09.677 1+0 records out 00:06:09.677 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250909 s, 16.3 MB/s 00:06:09.677 05:31:20 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.936 05:31:20 -- common/autotest_common.sh@884 -- # size=4096 00:06:09.936 05:31:20 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.936 05:31:20 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:09.936 05:31:20 -- common/autotest_common.sh@887 -- # return 0 00:06:09.936 05:31:20 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:09.936 05:31:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:09.936 05:31:20 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:09.936 05:31:20 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.936 05:31:20 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:09.936 { 00:06:09.936 "nbd_device": "/dev/nbd0", 00:06:09.936 "bdev_name": "Malloc0" 00:06:09.936 }, 00:06:09.936 { 00:06:09.936 "nbd_device": "/dev/nbd1", 00:06:09.936 "bdev_name": "Malloc1" 00:06:09.936 } 00:06:09.936 ]' 00:06:09.936 05:31:21 
-- bdev/nbd_common.sh@64 -- # echo '[ 00:06:09.936 { 00:06:09.936 "nbd_device": "/dev/nbd0", 00:06:09.936 "bdev_name": "Malloc0" 00:06:09.936 }, 00:06:09.936 { 00:06:09.936 "nbd_device": "/dev/nbd1", 00:06:09.936 "bdev_name": "Malloc1" 00:06:09.936 } 00:06:09.936 ]' 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:09.936 /dev/nbd1' 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:09.936 /dev/nbd1' 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@65 -- # count=2 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@95 -- # count=2 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:09.936 256+0 records in 00:06:09.936 256+0 records out 00:06:09.936 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010445 s, 100 MB/s 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:09.936 05:31:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:10.195 256+0 records in 00:06:10.195 256+0 records out 00:06:10.195 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197426 s, 53.1 MB/s 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:10.195 256+0 records in 00:06:10.195 256+0 records out 00:06:10.195 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0206707 s, 50.7 MB/s 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 
/dev/nbd1 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@51 -- # local i 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:10.195 05:31:21 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@41 -- # break 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@45 -- # return 0 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@41 -- # break 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@45 -- # return 0 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:10.454 05:31:21 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@65 -- # true 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@65 -- # count=0 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@104 -- # count=0 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:10.712 05:31:21 -- bdev/nbd_common.sh@109 -- # return 0 00:06:10.712 05:31:21 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:10.971 05:31:22 -- event/event.sh@35 -- # sleep 3 00:06:11.230 [2024-11-29 05:31:22.289065] app.c: 
798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:11.230 [2024-11-29 05:31:22.322276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:11.230 [2024-11-29 05:31:22.322278] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.230 [2024-11-29 05:31:22.361929] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:11.230 [2024-11-29 05:31:22.361975] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:14.514 05:31:25 -- event/event.sh@38 -- # waitforlisten 2194336 /var/tmp/spdk-nbd.sock 00:06:14.514 05:31:25 -- common/autotest_common.sh@829 -- # '[' -z 2194336 ']' 00:06:14.514 05:31:25 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:14.514 05:31:25 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.514 05:31:25 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:14.514 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:14.514 05:31:25 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.514 05:31:25 -- common/autotest_common.sh@10 -- # set +x 00:06:14.514 05:31:25 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:14.514 05:31:25 -- common/autotest_common.sh@862 -- # return 0 00:06:14.514 05:31:25 -- event/event.sh@39 -- # killprocess 2194336 00:06:14.514 05:31:25 -- common/autotest_common.sh@936 -- # '[' -z 2194336 ']' 00:06:14.514 05:31:25 -- common/autotest_common.sh@940 -- # kill -0 2194336 00:06:14.514 05:31:25 -- common/autotest_common.sh@941 -- # uname 00:06:14.514 05:31:25 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:14.514 05:31:25 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2194336 00:06:14.514 05:31:25 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:14.514 05:31:25 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:14.514 05:31:25 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2194336' 00:06:14.514 killing process with pid 2194336 00:06:14.514 05:31:25 -- common/autotest_common.sh@955 -- # kill 2194336 00:06:14.514 05:31:25 -- common/autotest_common.sh@960 -- # wait 2194336 00:06:14.514 spdk_app_start is called in Round 0. 00:06:14.514 Shutdown signal received, stop current app iteration 00:06:14.514 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:14.514 spdk_app_start is called in Round 1. 00:06:14.514 Shutdown signal received, stop current app iteration 00:06:14.514 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:14.514 spdk_app_start is called in Round 2. 00:06:14.514 Shutdown signal received, stop current app iteration 00:06:14.514 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:14.514 spdk_app_start is called in Round 3. 
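Each app_repeat round above drives the same nbd_dd_data_verify cycle from bdev/nbd_common.sh: fill a scratch file from /dev/urandom, dd it onto both exported nbd devices with O_DIRECT, then cmp the first 1 MiB (256 x 4096-byte blocks) back against the source. A minimal standalone sketch of that cycle, assuming /dev/nbd0 and /dev/nbd1 are already exported and using mktemp in place of the harness's fixed nbdrandtest path:

tmp_file=$(mktemp)                                   # harness uses a fixed path under test/event instead
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256  # 1 MiB of random data
for nbd in /dev/nbd0 /dev/nbd1; do
    dd if="$tmp_file" of="$nbd" bs=4096 count=256 oflag=direct   # write pass, bypassing the page cache
done
for nbd in /dev/nbd0 /dev/nbd1; do
    cmp -b -n 1M "$tmp_file" "$nbd"                  # verify pass: exits non-zero on any mismatch
done
rm -f "$tmp_file"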
00:06:14.514 Shutdown signal received, stop current app iteration 00:06:14.514 05:31:25 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:14.514 05:31:25 -- event/event.sh@42 -- # return 0 00:06:14.514 00:06:14.514 real 0m16.420s 00:06:14.514 user 0m35.163s 00:06:14.514 sys 0m3.091s 00:06:14.514 05:31:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:14.514 05:31:25 -- common/autotest_common.sh@10 -- # set +x 00:06:14.514 ************************************ 00:06:14.514 END TEST app_repeat 00:06:14.514 ************************************ 00:06:14.514 05:31:25 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:14.514 05:31:25 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:14.514 05:31:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:14.514 05:31:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.514 05:31:25 -- common/autotest_common.sh@10 -- # set +x 00:06:14.514 ************************************ 00:06:14.514 START TEST cpu_locks 00:06:14.514 ************************************ 00:06:14.514 05:31:25 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:14.514 * Looking for test storage... 00:06:14.514 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:14.514 05:31:25 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:14.514 05:31:25 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:14.514 05:31:25 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:14.514 05:31:25 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:14.514 05:31:25 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:14.514 05:31:25 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:14.514 05:31:25 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:14.514 05:31:25 -- scripts/common.sh@335 -- # IFS=.-: 00:06:14.514 05:31:25 -- scripts/common.sh@335 -- # read -ra ver1 00:06:14.514 05:31:25 -- scripts/common.sh@336 -- # IFS=.-: 00:06:14.514 05:31:25 -- scripts/common.sh@336 -- # read -ra ver2 00:06:14.514 05:31:25 -- scripts/common.sh@337 -- # local 'op=<' 00:06:14.514 05:31:25 -- scripts/common.sh@339 -- # ver1_l=2 00:06:14.515 05:31:25 -- scripts/common.sh@340 -- # ver2_l=1 00:06:14.515 05:31:25 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:14.515 05:31:25 -- scripts/common.sh@343 -- # case "$op" in 00:06:14.515 05:31:25 -- scripts/common.sh@344 -- # : 1 00:06:14.515 05:31:25 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:14.515 05:31:25 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:14.515 05:31:25 -- scripts/common.sh@364 -- # decimal 1 00:06:14.515 05:31:25 -- scripts/common.sh@352 -- # local d=1 00:06:14.515 05:31:25 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:14.515 05:31:25 -- scripts/common.sh@354 -- # echo 1 00:06:14.515 05:31:25 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:14.515 05:31:25 -- scripts/common.sh@365 -- # decimal 2 00:06:14.515 05:31:25 -- scripts/common.sh@352 -- # local d=2 00:06:14.515 05:31:25 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:14.515 05:31:25 -- scripts/common.sh@354 -- # echo 2 00:06:14.515 05:31:25 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:14.515 05:31:25 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:14.515 05:31:25 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:14.515 05:31:25 -- scripts/common.sh@367 -- # return 0 00:06:14.515 05:31:25 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:14.515 05:31:25 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:14.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.515 --rc genhtml_branch_coverage=1 00:06:14.515 --rc genhtml_function_coverage=1 00:06:14.515 --rc genhtml_legend=1 00:06:14.515 --rc geninfo_all_blocks=1 00:06:14.515 --rc geninfo_unexecuted_blocks=1 00:06:14.515 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.515 ' 00:06:14.515 05:31:25 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:14.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.515 --rc genhtml_branch_coverage=1 00:06:14.515 --rc genhtml_function_coverage=1 00:06:14.515 --rc genhtml_legend=1 00:06:14.515 --rc geninfo_all_blocks=1 00:06:14.515 --rc geninfo_unexecuted_blocks=1 00:06:14.515 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.515 ' 00:06:14.515 05:31:25 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:14.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.515 --rc genhtml_branch_coverage=1 00:06:14.515 --rc genhtml_function_coverage=1 00:06:14.515 --rc genhtml_legend=1 00:06:14.515 --rc geninfo_all_blocks=1 00:06:14.515 --rc geninfo_unexecuted_blocks=1 00:06:14.515 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.515 ' 00:06:14.515 05:31:25 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:14.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.515 --rc genhtml_branch_coverage=1 00:06:14.515 --rc genhtml_function_coverage=1 00:06:14.515 --rc genhtml_legend=1 00:06:14.515 --rc geninfo_all_blocks=1 00:06:14.515 --rc geninfo_unexecuted_blocks=1 00:06:14.515 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.515 ' 00:06:14.515 05:31:25 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:14.515 05:31:25 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:14.515 05:31:25 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:14.515 05:31:25 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:14.515 05:31:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:14.515 05:31:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.515 05:31:25 -- common/autotest_common.sh@10 -- # set +x 00:06:14.515 ************************************ 00:06:14.515 START TEST default_locks 
00:06:14.515 ************************************ 00:06:14.515 05:31:25 -- common/autotest_common.sh@1114 -- # default_locks 00:06:14.515 05:31:25 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=2197474 00:06:14.515 05:31:25 -- event/cpu_locks.sh@47 -- # waitforlisten 2197474 00:06:14.515 05:31:25 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:14.515 05:31:25 -- common/autotest_common.sh@829 -- # '[' -z 2197474 ']' 00:06:14.515 05:31:25 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.515 05:31:25 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.515 05:31:25 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.515 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.515 05:31:25 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.515 05:31:25 -- common/autotest_common.sh@10 -- # set +x 00:06:14.515 [2024-11-29 05:31:25.787207] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:14.515 [2024-11-29 05:31:25.787306] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2197474 ] 00:06:14.774 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.774 [2024-11-29 05:31:25.854635] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.774 [2024-11-29 05:31:25.892668] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:14.774 [2024-11-29 05:31:25.892788] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.339 05:31:26 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:15.339 05:31:26 -- common/autotest_common.sh@862 -- # return 0 00:06:15.339 05:31:26 -- event/cpu_locks.sh@49 -- # locks_exist 2197474 00:06:15.339 05:31:26 -- event/cpu_locks.sh@22 -- # lslocks -p 2197474 00:06:15.339 05:31:26 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:15.905 lslocks: write error 00:06:15.905 05:31:27 -- event/cpu_locks.sh@50 -- # killprocess 2197474 00:06:15.905 05:31:27 -- common/autotest_common.sh@936 -- # '[' -z 2197474 ']' 00:06:15.905 05:31:27 -- common/autotest_common.sh@940 -- # kill -0 2197474 00:06:15.905 05:31:27 -- common/autotest_common.sh@941 -- # uname 00:06:15.905 05:31:27 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:15.905 05:31:27 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2197474 00:06:15.905 05:31:27 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:15.905 05:31:27 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:15.905 05:31:27 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2197474' 00:06:15.905 killing process with pid 2197474 00:06:15.905 05:31:27 -- common/autotest_common.sh@955 -- # kill 2197474 00:06:15.905 05:31:27 -- common/autotest_common.sh@960 -- # wait 2197474 00:06:16.165 05:31:27 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 2197474 00:06:16.165 05:31:27 -- common/autotest_common.sh@650 -- # local es=0 00:06:16.165 05:31:27 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 2197474 00:06:16.165 05:31:27 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:16.165 05:31:27 -- 
common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:16.165 05:31:27 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:16.165 05:31:27 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:16.165 05:31:27 -- common/autotest_common.sh@653 -- # waitforlisten 2197474 00:06:16.165 05:31:27 -- common/autotest_common.sh@829 -- # '[' -z 2197474 ']' 00:06:16.165 05:31:27 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.165 05:31:27 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:16.165 05:31:27 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:16.165 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:16.165 05:31:27 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:16.165 05:31:27 -- common/autotest_common.sh@10 -- # set +x 00:06:16.165 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (2197474) - No such process 00:06:16.165 ERROR: process (pid: 2197474) is no longer running 00:06:16.165 05:31:27 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:16.165 05:31:27 -- common/autotest_common.sh@862 -- # return 1 00:06:16.165 05:31:27 -- common/autotest_common.sh@653 -- # es=1 00:06:16.165 05:31:27 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:16.165 05:31:27 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:16.165 05:31:27 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:16.165 05:31:27 -- event/cpu_locks.sh@54 -- # no_locks 00:06:16.165 05:31:27 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:16.165 05:31:27 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:16.165 05:31:27 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:16.165 00:06:16.165 real 0m1.699s 00:06:16.165 user 0m1.792s 00:06:16.165 sys 0m0.591s 00:06:16.165 05:31:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:16.165 05:31:27 -- common/autotest_common.sh@10 -- # set +x 00:06:16.165 ************************************ 00:06:16.165 END TEST default_locks 00:06:16.165 ************************************ 00:06:16.424 05:31:27 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:16.424 05:31:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:16.424 05:31:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:16.424 05:31:27 -- common/autotest_common.sh@10 -- # set +x 00:06:16.424 ************************************ 00:06:16.424 START TEST default_locks_via_rpc 00:06:16.424 ************************************ 00:06:16.424 05:31:27 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:06:16.424 05:31:27 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=2197837 00:06:16.424 05:31:27 -- event/cpu_locks.sh@63 -- # waitforlisten 2197837 00:06:16.424 05:31:27 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:16.424 05:31:27 -- common/autotest_common.sh@829 -- # '[' -z 2197837 ']' 00:06:16.424 05:31:27 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.424 05:31:27 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:16.424 05:31:27 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:16.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:16.424 05:31:27 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:16.424 05:31:27 -- common/autotest_common.sh@10 -- # set +x 00:06:16.424 [2024-11-29 05:31:27.535346] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:16.424 [2024-11-29 05:31:27.535421] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2197837 ] 00:06:16.424 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.424 [2024-11-29 05:31:27.602379] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.424 [2024-11-29 05:31:27.640360] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.424 [2024-11-29 05:31:27.640480] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.358 05:31:28 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:17.358 05:31:28 -- common/autotest_common.sh@862 -- # return 0 00:06:17.358 05:31:28 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:17.358 05:31:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:17.358 05:31:28 -- common/autotest_common.sh@10 -- # set +x 00:06:17.358 05:31:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:17.358 05:31:28 -- event/cpu_locks.sh@67 -- # no_locks 00:06:17.358 05:31:28 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:17.358 05:31:28 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:17.358 05:31:28 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:17.358 05:31:28 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:17.358 05:31:28 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:17.358 05:31:28 -- common/autotest_common.sh@10 -- # set +x 00:06:17.358 05:31:28 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:17.358 05:31:28 -- event/cpu_locks.sh@71 -- # locks_exist 2197837 00:06:17.358 05:31:28 -- event/cpu_locks.sh@22 -- # lslocks -p 2197837 00:06:17.358 05:31:28 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:17.617 05:31:28 -- event/cpu_locks.sh@73 -- # killprocess 2197837 00:06:17.617 05:31:28 -- common/autotest_common.sh@936 -- # '[' -z 2197837 ']' 00:06:17.617 05:31:28 -- common/autotest_common.sh@940 -- # kill -0 2197837 00:06:17.617 05:31:28 -- common/autotest_common.sh@941 -- # uname 00:06:17.617 05:31:28 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:17.617 05:31:28 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2197837 00:06:17.617 05:31:28 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:17.617 05:31:28 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:17.617 05:31:28 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2197837' 00:06:17.617 killing process with pid 2197837 00:06:17.617 05:31:28 -- common/autotest_common.sh@955 -- # kill 2197837 00:06:17.617 05:31:28 -- common/autotest_common.sh@960 -- # wait 2197837 00:06:17.875 00:06:17.875 real 0m1.639s 00:06:17.875 user 0m1.722s 00:06:17.875 sys 0m0.578s 00:06:17.875 05:31:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:17.875 05:31:29 -- common/autotest_common.sh@10 -- # set +x 00:06:17.875 ************************************ 00:06:17.875 END TEST default_locks_via_rpc 00:06:17.875 
************************************ 00:06:18.133 05:31:29 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:18.133 05:31:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:18.133 05:31:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.133 05:31:29 -- common/autotest_common.sh@10 -- # set +x 00:06:18.133 ************************************ 00:06:18.133 START TEST non_locking_app_on_locked_coremask 00:06:18.134 ************************************ 00:06:18.134 05:31:29 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:06:18.134 05:31:29 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=2198143 00:06:18.134 05:31:29 -- event/cpu_locks.sh@81 -- # waitforlisten 2198143 /var/tmp/spdk.sock 00:06:18.134 05:31:29 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:18.134 05:31:29 -- common/autotest_common.sh@829 -- # '[' -z 2198143 ']' 00:06:18.134 05:31:29 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:18.134 05:31:29 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:18.134 05:31:29 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:18.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:18.134 05:31:29 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:18.134 05:31:29 -- common/autotest_common.sh@10 -- # set +x 00:06:18.134 [2024-11-29 05:31:29.223876] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:18.134 [2024-11-29 05:31:29.223968] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2198143 ] 00:06:18.134 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.134 [2024-11-29 05:31:29.290350] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.134 [2024-11-29 05:31:29.328316] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:18.134 [2024-11-29 05:31:29.328438] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.067 05:31:30 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:19.067 05:31:30 -- common/autotest_common.sh@862 -- # return 0 00:06:19.067 05:31:30 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=2198183 00:06:19.067 05:31:30 -- event/cpu_locks.sh@85 -- # waitforlisten 2198183 /var/tmp/spdk2.sock 00:06:19.067 05:31:30 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:19.067 05:31:30 -- common/autotest_common.sh@829 -- # '[' -z 2198183 ']' 00:06:19.067 05:31:30 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:19.067 05:31:30 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:19.067 05:31:30 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:19.067 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:19.067 05:31:30 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:19.067 05:31:30 -- common/autotest_common.sh@10 -- # set +x 00:06:19.067 [2024-11-29 05:31:30.077969] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:19.067 [2024-11-29 05:31:30.078053] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2198183 ] 00:06:19.067 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.067 [2024-11-29 05:31:30.174321] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:19.067 [2024-11-29 05:31:30.174354] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.067 [2024-11-29 05:31:30.248617] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:19.067 [2024-11-29 05:31:30.248736] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.634 05:31:30 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:19.634 05:31:30 -- common/autotest_common.sh@862 -- # return 0 00:06:19.634 05:31:30 -- event/cpu_locks.sh@87 -- # locks_exist 2198143 00:06:19.634 05:31:30 -- event/cpu_locks.sh@22 -- # lslocks -p 2198143 00:06:19.634 05:31:30 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:20.569 lslocks: write error 00:06:20.569 05:31:31 -- event/cpu_locks.sh@89 -- # killprocess 2198143 00:06:20.569 05:31:31 -- common/autotest_common.sh@936 -- # '[' -z 2198143 ']' 00:06:20.569 05:31:31 -- common/autotest_common.sh@940 -- # kill -0 2198143 00:06:20.569 05:31:31 -- common/autotest_common.sh@941 -- # uname 00:06:20.569 05:31:31 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:20.569 05:31:31 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2198143 00:06:20.569 05:31:31 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:20.569 05:31:31 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:20.569 05:31:31 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2198143' 00:06:20.569 killing process with pid 2198143 00:06:20.569 05:31:31 -- common/autotest_common.sh@955 -- # kill 2198143 00:06:20.569 05:31:31 -- common/autotest_common.sh@960 -- # wait 2198143 00:06:21.137 05:31:32 -- event/cpu_locks.sh@90 -- # killprocess 2198183 00:06:21.137 05:31:32 -- common/autotest_common.sh@936 -- # '[' -z 2198183 ']' 00:06:21.137 05:31:32 -- common/autotest_common.sh@940 -- # kill -0 2198183 00:06:21.137 05:31:32 -- common/autotest_common.sh@941 -- # uname 00:06:21.137 05:31:32 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:21.137 05:31:32 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2198183 00:06:21.137 05:31:32 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:21.137 05:31:32 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:21.137 05:31:32 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2198183' 00:06:21.137 killing process with pid 2198183 00:06:21.137 05:31:32 -- common/autotest_common.sh@955 -- # kill 2198183 00:06:21.137 05:31:32 -- common/autotest_common.sh@960 -- # wait 2198183 00:06:21.396 00:06:21.396 real 0m3.490s 00:06:21.396 user 0m3.771s 00:06:21.396 sys 0m1.170s 00:06:21.396 05:31:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:21.396 05:31:32 -- common/autotest_common.sh@10 -- # set +x 00:06:21.396 
************************************ 00:06:21.396 END TEST non_locking_app_on_locked_coremask 00:06:21.396 ************************************ 00:06:21.655 05:31:32 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:21.655 05:31:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:21.655 05:31:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:21.655 05:31:32 -- common/autotest_common.sh@10 -- # set +x 00:06:21.655 ************************************ 00:06:21.655 START TEST locking_app_on_unlocked_coremask 00:06:21.655 ************************************ 00:06:21.655 05:31:32 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:06:21.655 05:31:32 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=2198731 00:06:21.655 05:31:32 -- event/cpu_locks.sh@99 -- # waitforlisten 2198731 /var/tmp/spdk.sock 00:06:21.655 05:31:32 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:21.655 05:31:32 -- common/autotest_common.sh@829 -- # '[' -z 2198731 ']' 00:06:21.655 05:31:32 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:21.655 05:31:32 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:21.655 05:31:32 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:21.655 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:21.655 05:31:32 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:21.655 05:31:32 -- common/autotest_common.sh@10 -- # set +x 00:06:21.655 [2024-11-29 05:31:32.761042] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:21.655 [2024-11-29 05:31:32.761129] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2198731 ] 00:06:21.655 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.655 [2024-11-29 05:31:32.828330] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:21.655 [2024-11-29 05:31:32.828357] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.655 [2024-11-29 05:31:32.866022] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:21.655 [2024-11-29 05:31:32.866141] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.591 05:31:33 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:22.591 05:31:33 -- common/autotest_common.sh@862 -- # return 0 00:06:22.591 05:31:33 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=2198989 00:06:22.591 05:31:33 -- event/cpu_locks.sh@103 -- # waitforlisten 2198989 /var/tmp/spdk2.sock 00:06:22.591 05:31:33 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:22.591 05:31:33 -- common/autotest_common.sh@829 -- # '[' -z 2198989 ']' 00:06:22.591 05:31:33 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:22.591 05:31:33 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:22.591 05:31:33 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
00:06:22.591 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:22.591 05:31:33 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:22.591 05:31:33 -- common/autotest_common.sh@10 -- # set +x 00:06:22.591 [2024-11-29 05:31:33.615545] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:22.591 [2024-11-29 05:31:33.615614] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2198989 ] 00:06:22.591 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.591 [2024-11-29 05:31:33.703677] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.591 [2024-11-29 05:31:33.776466] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:22.591 [2024-11-29 05:31:33.780605] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.158 05:31:34 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:23.158 05:31:34 -- common/autotest_common.sh@862 -- # return 0 00:06:23.158 05:31:34 -- event/cpu_locks.sh@105 -- # locks_exist 2198989 00:06:23.158 05:31:34 -- event/cpu_locks.sh@22 -- # lslocks -p 2198989 00:06:23.158 05:31:34 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:24.093 lslocks: write error 00:06:24.093 05:31:35 -- event/cpu_locks.sh@107 -- # killprocess 2198731 00:06:24.093 05:31:35 -- common/autotest_common.sh@936 -- # '[' -z 2198731 ']' 00:06:24.093 05:31:35 -- common/autotest_common.sh@940 -- # kill -0 2198731 00:06:24.093 05:31:35 -- common/autotest_common.sh@941 -- # uname 00:06:24.093 05:31:35 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:24.093 05:31:35 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2198731 00:06:24.093 05:31:35 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:24.093 05:31:35 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:24.093 05:31:35 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2198731' 00:06:24.093 killing process with pid 2198731 00:06:24.093 05:31:35 -- common/autotest_common.sh@955 -- # kill 2198731 00:06:24.093 05:31:35 -- common/autotest_common.sh@960 -- # wait 2198731 00:06:24.659 05:31:35 -- event/cpu_locks.sh@108 -- # killprocess 2198989 00:06:24.659 05:31:35 -- common/autotest_common.sh@936 -- # '[' -z 2198989 ']' 00:06:24.659 05:31:35 -- common/autotest_common.sh@940 -- # kill -0 2198989 00:06:24.660 05:31:35 -- common/autotest_common.sh@941 -- # uname 00:06:24.660 05:31:35 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:24.660 05:31:35 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2198989 00:06:24.660 05:31:35 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:24.660 05:31:35 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:24.660 05:31:35 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2198989' 00:06:24.660 killing process with pid 2198989 00:06:24.660 05:31:35 -- common/autotest_common.sh@955 -- # kill 2198989 00:06:24.660 05:31:35 -- common/autotest_common.sh@960 -- # wait 2198989 00:06:25.227 00:06:25.227 real 0m3.486s 00:06:25.227 user 0m3.735s 00:06:25.227 sys 0m1.118s 00:06:25.227 05:31:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:25.227 05:31:36 -- common/autotest_common.sh@10 -- # set +x 00:06:25.227 
************************************ 00:06:25.227 END TEST locking_app_on_unlocked_coremask 00:06:25.227 ************************************ 00:06:25.227 05:31:36 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:25.227 05:31:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:25.227 05:31:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:25.227 05:31:36 -- common/autotest_common.sh@10 -- # set +x 00:06:25.227 ************************************ 00:06:25.227 START TEST locking_app_on_locked_coremask 00:06:25.227 ************************************ 00:06:25.227 05:31:36 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:06:25.227 05:31:36 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=2199379 00:06:25.227 05:31:36 -- event/cpu_locks.sh@116 -- # waitforlisten 2199379 /var/tmp/spdk.sock 00:06:25.227 05:31:36 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:25.227 05:31:36 -- common/autotest_common.sh@829 -- # '[' -z 2199379 ']' 00:06:25.227 05:31:36 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.227 05:31:36 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.227 05:31:36 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.227 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:25.227 05:31:36 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.227 05:31:36 -- common/autotest_common.sh@10 -- # set +x 00:06:25.227 [2024-11-29 05:31:36.292121] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:25.227 [2024-11-29 05:31:36.292210] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2199379 ] 00:06:25.227 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.227 [2024-11-29 05:31:36.359605] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.227 [2024-11-29 05:31:36.397652] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:25.227 [2024-11-29 05:31:36.397772] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.998 05:31:37 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:25.998 05:31:37 -- common/autotest_common.sh@862 -- # return 0 00:06:25.998 05:31:37 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=2199581 00:06:25.998 05:31:37 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 2199581 /var/tmp/spdk2.sock 00:06:25.998 05:31:37 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:25.998 05:31:37 -- common/autotest_common.sh@650 -- # local es=0 00:06:25.998 05:31:37 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 2199581 /var/tmp/spdk2.sock 00:06:25.998 05:31:37 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:25.998 05:31:37 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:25.998 05:31:37 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:25.998 05:31:37 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:25.998 05:31:37 -- common/autotest_common.sh@653 -- # waitforlisten 2199581 /var/tmp/spdk2.sock 00:06:25.998 05:31:37 -- common/autotest_common.sh@829 -- # '[' -z 2199581 ']' 00:06:25.998 05:31:37 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:25.998 05:31:37 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.998 05:31:37 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:25.998 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:25.998 05:31:37 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.998 05:31:37 -- common/autotest_common.sh@10 -- # set +x 00:06:25.998 [2024-11-29 05:31:37.157701] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:25.998 [2024-11-29 05:31:37.157787] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2199581 ] 00:06:25.998 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.998 [2024-11-29 05:31:37.246245] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 2199379 has claimed it. 00:06:25.998 [2024-11-29 05:31:37.246278] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
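The *ERROR* lines above are the expected-failure path of locking_app_on_locked_coremask: spdk_tgt serializes core ownership through the /var/tmp/spdk_cpu_lock_* files checked throughout this log, so a second target started on an already-claimed cpumask aborts in claim_cpu_cores before it ever listens on its RPC socket, and the harness then confirms the pid is gone. A minimal reproduction of the same conflict under this workspace's paths (the sleep is a crude stand-in for the harness's waitforlisten polling):

SPDK_TGT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
$SPDK_TGT -m 0x1 &                        # first instance claims /var/tmp/spdk_cpu_lock_000
first=$!
sleep 2                                   # crude wait for startup; the harness polls the RPC socket instead
$SPDK_TGT -m 0x1 -r /var/tmp/spdk2.sock   # overlapping mask: claim_cpu_cores fails, exits non-zero
echo "second instance exit status: $?"
kill "$first" && wait "$first" 2>/dev/null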
00:06:26.566 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (2199581) - No such process 00:06:26.566 ERROR: process (pid: 2199581) is no longer running 00:06:26.566 05:31:37 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:26.566 05:31:37 -- common/autotest_common.sh@862 -- # return 1 00:06:26.566 05:31:37 -- common/autotest_common.sh@653 -- # es=1 00:06:26.566 05:31:37 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:26.566 05:31:37 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:26.566 05:31:37 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:26.566 05:31:37 -- event/cpu_locks.sh@122 -- # locks_exist 2199379 00:06:26.566 05:31:37 -- event/cpu_locks.sh@22 -- # lslocks -p 2199379 00:06:26.566 05:31:37 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:27.134 lslocks: write error 00:06:27.134 05:31:38 -- event/cpu_locks.sh@124 -- # killprocess 2199379 00:06:27.134 05:31:38 -- common/autotest_common.sh@936 -- # '[' -z 2199379 ']' 00:06:27.134 05:31:38 -- common/autotest_common.sh@940 -- # kill -0 2199379 00:06:27.134 05:31:38 -- common/autotest_common.sh@941 -- # uname 00:06:27.134 05:31:38 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:27.134 05:31:38 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2199379 00:06:27.134 05:31:38 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:27.134 05:31:38 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:27.134 05:31:38 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2199379' 00:06:27.134 killing process with pid 2199379 00:06:27.134 05:31:38 -- common/autotest_common.sh@955 -- # kill 2199379 00:06:27.134 05:31:38 -- common/autotest_common.sh@960 -- # wait 2199379 00:06:27.394 00:06:27.394 real 0m2.291s 00:06:27.394 user 0m2.550s 00:06:27.394 sys 0m0.631s 00:06:27.394 05:31:38 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:27.394 05:31:38 -- common/autotest_common.sh@10 -- # set +x 00:06:27.394 ************************************ 00:06:27.394 END TEST locking_app_on_locked_coremask 00:06:27.394 ************************************ 00:06:27.394 05:31:38 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:27.394 05:31:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:27.394 05:31:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:27.394 05:31:38 -- common/autotest_common.sh@10 -- # set +x 00:06:27.394 ************************************ 00:06:27.394 START TEST locking_overlapped_coremask 00:06:27.394 ************************************ 00:06:27.394 05:31:38 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:06:27.394 05:31:38 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=2199879 00:06:27.394 05:31:38 -- event/cpu_locks.sh@133 -- # waitforlisten 2199879 /var/tmp/spdk.sock 00:06:27.394 05:31:38 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:27.394 05:31:38 -- common/autotest_common.sh@829 -- # '[' -z 2199879 ']' 00:06:27.394 05:31:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:27.394 05:31:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:27.394 05:31:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:27.394 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:27.394 05:31:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:27.394 05:31:38 -- common/autotest_common.sh@10 -- # set +x 00:06:27.394 [2024-11-29 05:31:38.634720] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:27.394 [2024-11-29 05:31:38.634800] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2199879 ] 00:06:27.394 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.653 [2024-11-29 05:31:38.701816] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:27.653 [2024-11-29 05:31:38.736265] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:27.654 [2024-11-29 05:31:38.736464] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:27.654 [2024-11-29 05:31:38.736569] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:27.654 [2024-11-29 05:31:38.736572] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.223 05:31:39 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:28.223 05:31:39 -- common/autotest_common.sh@862 -- # return 0 00:06:28.223 05:31:39 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=2200045 00:06:28.223 05:31:39 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 2200045 /var/tmp/spdk2.sock 00:06:28.223 05:31:39 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:28.223 05:31:39 -- common/autotest_common.sh@650 -- # local es=0 00:06:28.223 05:31:39 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 2200045 /var/tmp/spdk2.sock 00:06:28.223 05:31:39 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:28.223 05:31:39 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:28.223 05:31:39 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:28.223 05:31:39 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:28.223 05:31:39 -- common/autotest_common.sh@653 -- # waitforlisten 2200045 /var/tmp/spdk2.sock 00:06:28.223 05:31:39 -- common/autotest_common.sh@829 -- # '[' -z 2200045 ']' 00:06:28.223 05:31:39 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:28.223 05:31:39 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:28.223 05:31:39 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:28.223 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:28.223 05:31:39 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:28.223 05:31:39 -- common/autotest_common.sh@10 -- # set +x 00:06:28.223 [2024-11-29 05:31:39.493918] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:28.223 [2024-11-29 05:31:39.493991] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2200045 ] 00:06:28.483 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.483 [2024-11-29 05:31:39.586545] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2199879 has claimed it. 00:06:28.483 [2024-11-29 05:31:39.586587] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:29.053 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (2200045) - No such process 00:06:29.053 ERROR: process (pid: 2200045) is no longer running 00:06:29.053 05:31:40 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.053 05:31:40 -- common/autotest_common.sh@862 -- # return 1 00:06:29.053 05:31:40 -- common/autotest_common.sh@653 -- # es=1 00:06:29.053 05:31:40 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:29.053 05:31:40 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:29.053 05:31:40 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:29.053 05:31:40 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:29.053 05:31:40 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:29.053 05:31:40 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:29.053 05:31:40 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:29.053 05:31:40 -- event/cpu_locks.sh@141 -- # killprocess 2199879 00:06:29.053 05:31:40 -- common/autotest_common.sh@936 -- # '[' -z 2199879 ']' 00:06:29.053 05:31:40 -- common/autotest_common.sh@940 -- # kill -0 2199879 00:06:29.053 05:31:40 -- common/autotest_common.sh@941 -- # uname 00:06:29.053 05:31:40 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:29.053 05:31:40 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2199879 00:06:29.053 05:31:40 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:29.053 05:31:40 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:29.053 05:31:40 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2199879' 00:06:29.053 killing process with pid 2199879 00:06:29.053 05:31:40 -- common/autotest_common.sh@955 -- # kill 2199879 00:06:29.053 05:31:40 -- common/autotest_common.sh@960 -- # wait 2199879 00:06:29.313 00:06:29.313 real 0m1.908s 00:06:29.313 user 0m5.492s 00:06:29.313 sys 0m0.469s 00:06:29.313 05:31:40 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:29.313 05:31:40 -- common/autotest_common.sh@10 -- # set +x 00:06:29.313 ************************************ 00:06:29.313 END TEST locking_overlapped_coremask 00:06:29.313 ************************************ 00:06:29.313 05:31:40 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:29.313 05:31:40 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:29.313 05:31:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:29.313 05:31:40 -- common/autotest_common.sh@10 -- # set +x 00:06:29.313 ************************************ 00:06:29.313 
START TEST locking_overlapped_coremask_via_rpc 00:06:29.313 ************************************ 00:06:29.313 05:31:40 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:06:29.313 05:31:40 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=2200194 00:06:29.313 05:31:40 -- event/cpu_locks.sh@149 -- # waitforlisten 2200194 /var/tmp/spdk.sock 00:06:29.313 05:31:40 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:29.313 05:31:40 -- common/autotest_common.sh@829 -- # '[' -z 2200194 ']' 00:06:29.313 05:31:40 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:29.313 05:31:40 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:29.313 05:31:40 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:29.313 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:29.313 05:31:40 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:29.313 05:31:40 -- common/autotest_common.sh@10 -- # set +x 00:06:29.313 [2024-11-29 05:31:40.593096] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:29.313 [2024-11-29 05:31:40.593189] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2200194 ] 00:06:29.572 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.572 [2024-11-29 05:31:40.660106] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:29.572 [2024-11-29 05:31:40.660138] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:29.572 [2024-11-29 05:31:40.694364] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:29.572 [2024-11-29 05:31:40.694535] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:29.572 [2024-11-29 05:31:40.694632] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:29.572 [2024-11-29 05:31:40.694635] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.142 05:31:41 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:30.142 05:31:41 -- common/autotest_common.sh@862 -- # return 0 00:06:30.142 05:31:41 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:30.142 05:31:41 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=2200465 00:06:30.142 05:31:41 -- event/cpu_locks.sh@153 -- # waitforlisten 2200465 /var/tmp/spdk2.sock 00:06:30.142 05:31:41 -- common/autotest_common.sh@829 -- # '[' -z 2200465 ']' 00:06:30.142 05:31:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:30.142 05:31:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:30.142 05:31:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:30.142 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:30.142 05:31:41 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:30.142 05:31:41 -- common/autotest_common.sh@10 -- # set +x 00:06:30.142 [2024-11-29 05:31:41.429718] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:30.142 [2024-11-29 05:31:41.429773] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2200465 ] 00:06:30.401 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.401 [2024-11-29 05:31:41.519760] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:30.401 [2024-11-29 05:31:41.519792] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:30.401 [2024-11-29 05:31:41.595491] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:30.401 [2024-11-29 05:31:41.595669] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:30.401 [2024-11-29 05:31:41.599643] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:30.401 [2024-11-29 05:31:41.599644] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:31.340 05:31:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:31.340 05:31:42 -- common/autotest_common.sh@862 -- # return 0 00:06:31.340 05:31:42 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:31.340 05:31:42 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.340 05:31:42 -- common/autotest_common.sh@10 -- # set +x 00:06:31.340 05:31:42 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.340 05:31:42 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:31.340 05:31:42 -- common/autotest_common.sh@650 -- # local es=0 00:06:31.340 05:31:42 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:31.340 05:31:42 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:31.340 05:31:42 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:31.340 05:31:42 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:31.340 05:31:42 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:31.340 05:31:42 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:31.340 05:31:42 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.340 05:31:42 -- common/autotest_common.sh@10 -- # set +x 00:06:31.340 [2024-11-29 05:31:42.292659] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2200194 has claimed it. 
00:06:31.340 request: 00:06:31.340 { 00:06:31.340 "method": "framework_enable_cpumask_locks", 00:06:31.340 "req_id": 1 00:06:31.340 } 00:06:31.340 Got JSON-RPC error response 00:06:31.340 response: 00:06:31.340 { 00:06:31.340 "code": -32603, 00:06:31.340 "message": "Failed to claim CPU core: 2" 00:06:31.340 } 00:06:31.340 05:31:42 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:31.340 05:31:42 -- common/autotest_common.sh@653 -- # es=1 00:06:31.340 05:31:42 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:31.340 05:31:42 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:31.340 05:31:42 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:31.340 05:31:42 -- event/cpu_locks.sh@158 -- # waitforlisten 2200194 /var/tmp/spdk.sock 00:06:31.340 05:31:42 -- common/autotest_common.sh@829 -- # '[' -z 2200194 ']' 00:06:31.340 05:31:42 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:31.340 05:31:42 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:31.340 05:31:42 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:31.340 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:31.340 05:31:42 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:31.340 05:31:42 -- common/autotest_common.sh@10 -- # set +x 00:06:31.340 05:31:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:31.340 05:31:42 -- common/autotest_common.sh@862 -- # return 0 00:06:31.340 05:31:42 -- event/cpu_locks.sh@159 -- # waitforlisten 2200465 /var/tmp/spdk2.sock 00:06:31.340 05:31:42 -- common/autotest_common.sh@829 -- # '[' -z 2200465 ']' 00:06:31.340 05:31:42 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:31.340 05:31:42 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:31.340 05:31:42 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:31.340 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:31.340 05:31:42 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:31.340 05:31:42 -- common/autotest_common.sh@10 -- # set +x 00:06:31.600 05:31:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:31.600 05:31:42 -- common/autotest_common.sh@862 -- # return 0 00:06:31.600 05:31:42 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:31.600 05:31:42 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:31.600 05:31:42 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:31.600 05:31:42 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:31.600 00:06:31.600 real 0m2.118s 00:06:31.600 user 0m0.887s 00:06:31.600 sys 0m0.162s 00:06:31.600 05:31:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.600 05:31:42 -- common/autotest_common.sh@10 -- # set +x 00:06:31.600 ************************************ 00:06:31.600 END TEST locking_overlapped_coremask_via_rpc 00:06:31.600 ************************************ 00:06:31.600 05:31:42 -- event/cpu_locks.sh@174 -- # cleanup 00:06:31.600 05:31:42 -- event/cpu_locks.sh@15 -- # [[ -z 2200194 ]] 00:06:31.600 05:31:42 -- event/cpu_locks.sh@15 -- # killprocess 2200194 00:06:31.600 05:31:42 -- common/autotest_common.sh@936 -- # '[' -z 2200194 ']' 00:06:31.600 05:31:42 -- common/autotest_common.sh@940 -- # kill -0 2200194 00:06:31.600 05:31:42 -- common/autotest_common.sh@941 -- # uname 00:06:31.600 05:31:42 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:31.600 05:31:42 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2200194 00:06:31.600 05:31:42 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:31.600 05:31:42 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:31.601 05:31:42 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2200194' 00:06:31.601 killing process with pid 2200194 00:06:31.601 05:31:42 -- common/autotest_common.sh@955 -- # kill 2200194 00:06:31.601 05:31:42 -- common/autotest_common.sh@960 -- # wait 2200194 00:06:31.860 05:31:43 -- event/cpu_locks.sh@16 -- # [[ -z 2200465 ]] 00:06:31.860 05:31:43 -- event/cpu_locks.sh@16 -- # killprocess 2200465 00:06:31.860 05:31:43 -- common/autotest_common.sh@936 -- # '[' -z 2200465 ']' 00:06:31.860 05:31:43 -- common/autotest_common.sh@940 -- # kill -0 2200465 00:06:31.860 05:31:43 -- common/autotest_common.sh@941 -- # uname 00:06:31.860 05:31:43 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:31.860 05:31:43 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2200465 00:06:31.860 05:31:43 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:31.860 05:31:43 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:31.860 05:31:43 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2200465' 00:06:31.860 killing process with pid 2200465 00:06:31.860 05:31:43 -- common/autotest_common.sh@955 -- # kill 2200465 00:06:31.860 05:31:43 -- common/autotest_common.sh@960 -- # wait 2200465 00:06:32.430 05:31:43 -- event/cpu_locks.sh@18 -- # rm -f 00:06:32.430 05:31:43 -- event/cpu_locks.sh@1 -- # cleanup 00:06:32.430 05:31:43 -- event/cpu_locks.sh@15 -- # [[ -z 2200194 ]] 00:06:32.430 05:31:43 -- event/cpu_locks.sh@15 -- # killprocess 2200194 
00:06:32.430 05:31:43 -- common/autotest_common.sh@936 -- # '[' -z 2200194 ']' 00:06:32.430 05:31:43 -- common/autotest_common.sh@940 -- # kill -0 2200194 00:06:32.430 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (2200194) - No such process 00:06:32.430 05:31:43 -- common/autotest_common.sh@963 -- # echo 'Process with pid 2200194 is not found' 00:06:32.430 Process with pid 2200194 is not found 00:06:32.430 05:31:43 -- event/cpu_locks.sh@16 -- # [[ -z 2200465 ]] 00:06:32.430 05:31:43 -- event/cpu_locks.sh@16 -- # killprocess 2200465 00:06:32.430 05:31:43 -- common/autotest_common.sh@936 -- # '[' -z 2200465 ']' 00:06:32.430 05:31:43 -- common/autotest_common.sh@940 -- # kill -0 2200465 00:06:32.430 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (2200465) - No such process 00:06:32.430 05:31:43 -- common/autotest_common.sh@963 -- # echo 'Process with pid 2200465 is not found' 00:06:32.430 Process with pid 2200465 is not found 00:06:32.430 05:31:43 -- event/cpu_locks.sh@18 -- # rm -f 00:06:32.430 00:06:32.430 real 0m17.906s 00:06:32.430 user 0m30.938s 00:06:32.430 sys 0m5.678s 00:06:32.430 05:31:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:32.430 05:31:43 -- common/autotest_common.sh@10 -- # set +x 00:06:32.430 ************************************ 00:06:32.430 END TEST cpu_locks 00:06:32.430 ************************************ 00:06:32.430 00:06:32.430 real 0m43.033s 00:06:32.430 user 1m21.022s 00:06:32.430 sys 0m9.792s 00:06:32.430 05:31:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:32.430 05:31:43 -- common/autotest_common.sh@10 -- # set +x 00:06:32.430 ************************************ 00:06:32.430 END TEST event 00:06:32.430 ************************************ 00:06:32.430 05:31:43 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:32.430 05:31:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:32.430 05:31:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.430 05:31:43 -- common/autotest_common.sh@10 -- # set +x 00:06:32.430 ************************************ 00:06:32.430 START TEST thread 00:06:32.430 ************************************ 00:06:32.430 05:31:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:32.430 * Looking for test storage... 
00:06:32.430 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:32.430 05:31:43 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:32.430 05:31:43 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:32.430 05:31:43 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:32.430 05:31:43 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:32.430 05:31:43 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:32.430 05:31:43 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:32.430 05:31:43 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:32.430 05:31:43 -- scripts/common.sh@335 -- # IFS=.-: 00:06:32.430 05:31:43 -- scripts/common.sh@335 -- # read -ra ver1 00:06:32.430 05:31:43 -- scripts/common.sh@336 -- # IFS=.-: 00:06:32.430 05:31:43 -- scripts/common.sh@336 -- # read -ra ver2 00:06:32.430 05:31:43 -- scripts/common.sh@337 -- # local 'op=<' 00:06:32.430 05:31:43 -- scripts/common.sh@339 -- # ver1_l=2 00:06:32.430 05:31:43 -- scripts/common.sh@340 -- # ver2_l=1 00:06:32.430 05:31:43 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:32.430 05:31:43 -- scripts/common.sh@343 -- # case "$op" in 00:06:32.430 05:31:43 -- scripts/common.sh@344 -- # : 1 00:06:32.430 05:31:43 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:32.430 05:31:43 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:32.689 05:31:43 -- scripts/common.sh@364 -- # decimal 1 00:06:32.689 05:31:43 -- scripts/common.sh@352 -- # local d=1 00:06:32.689 05:31:43 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:32.689 05:31:43 -- scripts/common.sh@354 -- # echo 1 00:06:32.689 05:31:43 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:32.689 05:31:43 -- scripts/common.sh@365 -- # decimal 2 00:06:32.689 05:31:43 -- scripts/common.sh@352 -- # local d=2 00:06:32.689 05:31:43 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:32.689 05:31:43 -- scripts/common.sh@354 -- # echo 2 00:06:32.689 05:31:43 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:32.689 05:31:43 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:32.689 05:31:43 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:32.689 05:31:43 -- scripts/common.sh@367 -- # return 0 00:06:32.689 05:31:43 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:32.689 05:31:43 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:32.689 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:32.689 --rc genhtml_branch_coverage=1 00:06:32.689 --rc genhtml_function_coverage=1 00:06:32.689 --rc genhtml_legend=1 00:06:32.689 --rc geninfo_all_blocks=1 00:06:32.689 --rc geninfo_unexecuted_blocks=1 00:06:32.689 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:32.689 ' 00:06:32.689 05:31:43 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:32.689 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:32.689 --rc genhtml_branch_coverage=1 00:06:32.689 --rc genhtml_function_coverage=1 00:06:32.689 --rc genhtml_legend=1 00:06:32.689 --rc geninfo_all_blocks=1 00:06:32.689 --rc geninfo_unexecuted_blocks=1 00:06:32.689 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:32.689 ' 00:06:32.689 05:31:43 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:32.689 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:32.689 --rc genhtml_branch_coverage=1 
00:06:32.689 --rc genhtml_function_coverage=1 00:06:32.689 --rc genhtml_legend=1 00:06:32.689 --rc geninfo_all_blocks=1 00:06:32.689 --rc geninfo_unexecuted_blocks=1 00:06:32.689 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:32.689 ' 00:06:32.689 05:31:43 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:32.689 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:32.689 --rc genhtml_branch_coverage=1 00:06:32.689 --rc genhtml_function_coverage=1 00:06:32.689 --rc genhtml_legend=1 00:06:32.689 --rc geninfo_all_blocks=1 00:06:32.689 --rc geninfo_unexecuted_blocks=1 00:06:32.689 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:32.689 ' 00:06:32.689 05:31:43 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:32.689 05:31:43 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:32.689 05:31:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.689 05:31:43 -- common/autotest_common.sh@10 -- # set +x 00:06:32.689 ************************************ 00:06:32.689 START TEST thread_poller_perf 00:06:32.689 ************************************ 00:06:32.689 05:31:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:32.689 [2024-11-29 05:31:43.766054] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:32.689 [2024-11-29 05:31:43.766160] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2200848 ] 00:06:32.689 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.689 [2024-11-29 05:31:43.835302] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.689 [2024-11-29 05:31:43.871570] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.689 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:34.067 [2024-11-29T04:31:45.368Z] ====================================== 00:06:34.067 [2024-11-29T04:31:45.368Z] busy:2505050326 (cyc) 00:06:34.067 [2024-11-29T04:31:45.368Z] total_run_count: 785000 00:06:34.067 [2024-11-29T04:31:45.368Z] tsc_hz: 2500000000 (cyc) 00:06:34.067 [2024-11-29T04:31:45.368Z] ====================================== 00:06:34.067 [2024-11-29T04:31:45.368Z] poller_cost: 3191 (cyc), 1276 (nsec) 00:06:34.067 00:06:34.067 real 0m1.181s 00:06:34.067 user 0m1.087s 00:06:34.067 sys 0m0.089s 00:06:34.067 05:31:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.067 05:31:44 -- common/autotest_common.sh@10 -- # set +x 00:06:34.067 ************************************ 00:06:34.067 END TEST thread_poller_perf 00:06:34.067 ************************************ 00:06:34.067 05:31:44 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:34.067 05:31:44 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:34.067 05:31:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.067 05:31:44 -- common/autotest_common.sh@10 -- # set +x 00:06:34.067 ************************************ 00:06:34.067 START TEST thread_poller_perf 00:06:34.067 ************************************ 00:06:34.067 05:31:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:34.067 [2024-11-29 05:31:45.002155] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:34.067 [2024-11-29 05:31:45.002246] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2201126 ] 00:06:34.067 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.067 [2024-11-29 05:31:45.072026] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.067 [2024-11-29 05:31:45.106450] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.067 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:06:35.003 [2024-11-29T04:31:46.304Z] ====================================== 00:06:35.003 [2024-11-29T04:31:46.304Z] busy:2501998070 (cyc) 00:06:35.003 [2024-11-29T04:31:46.304Z] total_run_count: 13091000 00:06:35.003 [2024-11-29T04:31:46.304Z] tsc_hz: 2500000000 (cyc) 00:06:35.003 [2024-11-29T04:31:46.304Z] ====================================== 00:06:35.003 [2024-11-29T04:31:46.304Z] poller_cost: 191 (cyc), 76 (nsec) 00:06:35.003 00:06:35.003 real 0m1.177s 00:06:35.003 user 0m1.087s 00:06:35.003 sys 0m0.085s 00:06:35.003 05:31:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.003 05:31:46 -- common/autotest_common.sh@10 -- # set +x 00:06:35.003 ************************************ 00:06:35.003 END TEST thread_poller_perf 00:06:35.003 ************************************ 00:06:35.003 05:31:46 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:35.003 05:31:46 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:35.003 05:31:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:35.003 05:31:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.003 05:31:46 -- common/autotest_common.sh@10 -- # set +x 00:06:35.003 ************************************ 00:06:35.003 START TEST thread_spdk_lock 00:06:35.003 ************************************ 00:06:35.003 05:31:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:35.004 [2024-11-29 05:31:46.229535] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:35.004 [2024-11-29 05:31:46.229637] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2201414 ] 00:06:35.004 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.004 [2024-11-29 05:31:46.299754] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:35.262 [2024-11-29 05:31:46.336609] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.262 [2024-11-29 05:31:46.336610] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:35.521 [2024-11-29 05:31:46.821077] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:35.521 [2024-11-29 05:31:46.821116] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:35.521 [2024-11-29 05:31:46.821127] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x12e2e40 00:06:35.521 [2024-11-29 05:31:46.821920] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:35.521 [2024-11-29 05:31:46.822025] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:35.521 [2024-11-29 05:31:46.822044] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:35.780 Starting test contend 00:06:35.780 Worker Delay Wait us Hold us Total us 00:06:35.780 0 3 168815 180186 349002 00:06:35.780 1 5 87913 282003 369917 00:06:35.780 PASS test contend 00:06:35.780 Starting test hold_by_poller 00:06:35.780 PASS test hold_by_poller 00:06:35.780 Starting test hold_by_message 00:06:35.780 PASS test hold_by_message 00:06:35.780 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:35.780 100014 assertions passed 00:06:35.780 0 assertions failed 00:06:35.780 00:06:35.780 real 0m0.663s 00:06:35.780 user 0m1.056s 00:06:35.780 sys 0m0.089s 00:06:35.780 05:31:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.780 05:31:46 -- common/autotest_common.sh@10 -- # set +x 00:06:35.780 ************************************ 00:06:35.780 END TEST thread_spdk_lock 00:06:35.780 ************************************ 00:06:35.780 00:06:35.780 real 0m3.361s 00:06:35.780 user 0m3.382s 00:06:35.780 sys 0m0.494s 00:06:35.780 05:31:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.780 05:31:46 -- common/autotest_common.sh@10 -- # set +x 00:06:35.780 ************************************ 00:06:35.780 END TEST thread 00:06:35.780 ************************************ 00:06:35.780 05:31:46 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:35.780 05:31:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:35.780 05:31:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.780 05:31:46 -- common/autotest_common.sh@10 -- # set +x 00:06:35.780 ************************************ 00:06:35.780 START TEST accel 00:06:35.780 ************************************ 00:06:35.780 05:31:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:35.780 * Looking for test storage... 00:06:35.780 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:35.780 05:31:47 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:35.780 05:31:47 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:35.780 05:31:47 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:36.040 05:31:47 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:36.040 05:31:47 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:36.040 05:31:47 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:36.040 05:31:47 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:36.040 05:31:47 -- scripts/common.sh@335 -- # IFS=.-: 00:06:36.040 05:31:47 -- scripts/common.sh@335 -- # read -ra ver1 00:06:36.040 05:31:47 -- scripts/common.sh@336 -- # IFS=.-: 00:06:36.040 05:31:47 -- scripts/common.sh@336 -- # read -ra ver2 00:06:36.040 05:31:47 -- scripts/common.sh@337 -- # local 'op=<' 00:06:36.040 05:31:47 -- scripts/common.sh@339 -- # ver1_l=2 00:06:36.040 05:31:47 -- scripts/common.sh@340 -- # ver2_l=1 00:06:36.040 05:31:47 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:36.040 05:31:47 -- scripts/common.sh@343 -- # case "$op" in 00:06:36.040 05:31:47 -- scripts/common.sh@344 -- # : 1 00:06:36.040 05:31:47 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:36.040 05:31:47 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:36.040 05:31:47 -- scripts/common.sh@364 -- # decimal 1 00:06:36.040 05:31:47 -- scripts/common.sh@352 -- # local d=1 00:06:36.040 05:31:47 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:36.040 05:31:47 -- scripts/common.sh@354 -- # echo 1 00:06:36.040 05:31:47 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:36.040 05:31:47 -- scripts/common.sh@365 -- # decimal 2 00:06:36.040 05:31:47 -- scripts/common.sh@352 -- # local d=2 00:06:36.040 05:31:47 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:36.040 05:31:47 -- scripts/common.sh@354 -- # echo 2 00:06:36.040 05:31:47 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:36.040 05:31:47 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:36.040 05:31:47 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:36.040 05:31:47 -- scripts/common.sh@367 -- # return 0 00:06:36.040 05:31:47 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:36.040 05:31:47 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:36.040 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.040 --rc genhtml_branch_coverage=1 00:06:36.040 --rc genhtml_function_coverage=1 00:06:36.040 --rc genhtml_legend=1 00:06:36.040 --rc geninfo_all_blocks=1 00:06:36.040 --rc geninfo_unexecuted_blocks=1 00:06:36.040 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.040 ' 00:06:36.040 05:31:47 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:36.040 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.040 --rc genhtml_branch_coverage=1 00:06:36.040 --rc genhtml_function_coverage=1 00:06:36.040 --rc genhtml_legend=1 00:06:36.040 --rc geninfo_all_blocks=1 00:06:36.040 --rc geninfo_unexecuted_blocks=1 00:06:36.040 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.040 ' 00:06:36.040 05:31:47 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:36.040 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.040 --rc genhtml_branch_coverage=1 00:06:36.040 --rc genhtml_function_coverage=1 00:06:36.040 --rc genhtml_legend=1 00:06:36.040 --rc geninfo_all_blocks=1 00:06:36.040 --rc geninfo_unexecuted_blocks=1 00:06:36.040 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.040 ' 00:06:36.040 05:31:47 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:36.040 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.040 --rc genhtml_branch_coverage=1 00:06:36.040 --rc genhtml_function_coverage=1 00:06:36.040 --rc genhtml_legend=1 00:06:36.040 --rc geninfo_all_blocks=1 00:06:36.040 --rc geninfo_unexecuted_blocks=1 00:06:36.040 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.040 ' 00:06:36.040 05:31:47 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:36.040 05:31:47 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:36.040 05:31:47 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:36.040 05:31:47 -- accel/accel.sh@59 -- # spdk_tgt_pid=2201688 00:06:36.040 05:31:47 -- accel/accel.sh@60 -- # waitforlisten 2201688 00:06:36.040 05:31:47 -- common/autotest_common.sh@829 -- # '[' -z 2201688 ']' 00:06:36.040 05:31:47 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:36.040 05:31:47 -- accel/accel.sh@58 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:36.040 05:31:47 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:36.040 05:31:47 -- accel/accel.sh@58 -- # build_accel_config 00:06:36.040 05:31:47 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:36.040 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:36.040 05:31:47 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:36.040 05:31:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.040 05:31:47 -- common/autotest_common.sh@10 -- # set +x 00:06:36.040 05:31:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.040 05:31:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.040 05:31:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.040 05:31:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.040 05:31:47 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.040 05:31:47 -- accel/accel.sh@42 -- # jq -r . 00:06:36.040 [2024-11-29 05:31:47.168562] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:36.040 [2024-11-29 05:31:47.168644] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2201688 ] 00:06:36.040 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.040 [2024-11-29 05:31:47.235638] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.040 [2024-11-29 05:31:47.273424] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:36.040 [2024-11-29 05:31:47.273543] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.980 05:31:47 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:36.980 05:31:47 -- common/autotest_common.sh@862 -- # return 0 00:06:36.980 05:31:48 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:36.980 05:31:48 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:36.980 05:31:48 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:36.980 05:31:48 -- common/autotest_common.sh@10 -- # set +x 00:06:36.980 05:31:48 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:36.980 05:31:48 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in 
"${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # IFS== 00:06:36.980 05:31:48 -- accel/accel.sh@64 -- # read -r opc module 00:06:36.980 05:31:48 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:36.980 05:31:48 -- accel/accel.sh@67 -- # killprocess 2201688 00:06:36.980 05:31:48 -- common/autotest_common.sh@936 -- # '[' -z 2201688 ']' 00:06:36.980 05:31:48 -- common/autotest_common.sh@940 -- # kill -0 2201688 00:06:36.980 05:31:48 -- common/autotest_common.sh@941 -- # uname 00:06:36.980 05:31:48 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:36.980 05:31:48 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2201688 00:06:36.980 05:31:48 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:36.980 05:31:48 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:36.980 05:31:48 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2201688' 00:06:36.980 killing process with pid 2201688 00:06:36.980 05:31:48 -- common/autotest_common.sh@955 -- # kill 2201688 00:06:36.980 05:31:48 -- common/autotest_common.sh@960 -- # wait 2201688 00:06:37.240 05:31:48 -- accel/accel.sh@68 -- # trap - ERR 00:06:37.240 05:31:48 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:37.240 05:31:48 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:37.240 05:31:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.240 05:31:48 -- common/autotest_common.sh@10 -- # set +x 00:06:37.240 05:31:48 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:06:37.240 05:31:48 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.240 05:31:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.240 05:31:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:37.240 05:31:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.240 05:31:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.240 05:31:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.240 05:31:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.240 05:31:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.240 05:31:48 -- accel/accel.sh@42 -- # jq -r . 
00:06:37.240 05:31:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:37.240 05:31:48 -- common/autotest_common.sh@10 -- # set +x 00:06:37.240 05:31:48 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:37.240 05:31:48 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:37.240 05:31:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.240 05:31:48 -- common/autotest_common.sh@10 -- # set +x 00:06:37.240 ************************************ 00:06:37.240 START TEST accel_missing_filename 00:06:37.240 ************************************ 00:06:37.240 05:31:48 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:06:37.240 05:31:48 -- common/autotest_common.sh@650 -- # local es=0 00:06:37.241 05:31:48 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:37.241 05:31:48 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:37.241 05:31:48 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:37.241 05:31:48 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:37.241 05:31:48 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:37.241 05:31:48 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:06:37.241 05:31:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:37.241 05:31:48 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.241 05:31:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.241 05:31:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.241 05:31:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.241 05:31:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.241 05:31:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.241 05:31:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.241 05:31:48 -- accel/accel.sh@42 -- # jq -r . 00:06:37.241 [2024-11-29 05:31:48.481234] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:37.241 [2024-11-29 05:31:48.481349] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2201884 ] 00:06:37.241 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.500 [2024-11-29 05:31:48.551965] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.500 [2024-11-29 05:31:48.587874] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.500 [2024-11-29 05:31:48.627776] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:37.500 [2024-11-29 05:31:48.688039] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:37.500 A filename is required. 
00:06:37.500 05:31:48 -- common/autotest_common.sh@653 -- # es=234 00:06:37.500 05:31:48 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:37.500 05:31:48 -- common/autotest_common.sh@662 -- # es=106 00:06:37.500 05:31:48 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:37.500 05:31:48 -- common/autotest_common.sh@670 -- # es=1 00:06:37.500 05:31:48 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:37.500 00:06:37.500 real 0m0.291s 00:06:37.500 user 0m0.194s 00:06:37.500 sys 0m0.135s 00:06:37.500 05:31:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:37.500 05:31:48 -- common/autotest_common.sh@10 -- # set +x 00:06:37.500 ************************************ 00:06:37.500 END TEST accel_missing_filename 00:06:37.500 ************************************ 00:06:37.500 05:31:48 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.500 05:31:48 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:37.500 05:31:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.500 05:31:48 -- common/autotest_common.sh@10 -- # set +x 00:06:37.500 ************************************ 00:06:37.500 START TEST accel_compress_verify 00:06:37.500 ************************************ 00:06:37.500 05:31:48 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.500 05:31:48 -- common/autotest_common.sh@650 -- # local es=0 00:06:37.500 05:31:48 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.500 05:31:48 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:37.500 05:31:48 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:37.500 05:31:48 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:37.500 05:31:48 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:37.501 05:31:48 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.501 05:31:48 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.501 05:31:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:37.501 05:31:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.501 05:31:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.501 05:31:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.501 05:31:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.501 05:31:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.501 05:31:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.501 05:31:48 -- accel/accel.sh@42 -- # jq -r . 00:06:37.760 [2024-11-29 05:31:48.810206] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:37.760 [2024-11-29 05:31:48.810296] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2202062 ] 00:06:37.760 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.760 [2024-11-29 05:31:48.878688] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.760 [2024-11-29 05:31:48.914484] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.760 [2024-11-29 05:31:48.954256] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:37.760 [2024-11-29 05:31:49.013073] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:38.020 00:06:38.020 Compression does not support the verify option, aborting. 00:06:38.020 05:31:49 -- common/autotest_common.sh@653 -- # es=161 00:06:38.020 05:31:49 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:38.020 05:31:49 -- common/autotest_common.sh@662 -- # es=33 00:06:38.020 05:31:49 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:38.020 05:31:49 -- common/autotest_common.sh@670 -- # es=1 00:06:38.020 05:31:49 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:38.020 00:06:38.020 real 0m0.287s 00:06:38.020 user 0m0.186s 00:06:38.020 sys 0m0.125s 00:06:38.020 05:31:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:38.020 05:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:38.020 ************************************ 00:06:38.020 END TEST accel_compress_verify 00:06:38.020 ************************************ 00:06:38.020 05:31:49 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:38.020 05:31:49 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:38.020 05:31:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.020 05:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:38.020 ************************************ 00:06:38.020 START TEST accel_wrong_workload 00:06:38.020 ************************************ 00:06:38.020 05:31:49 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:06:38.020 05:31:49 -- common/autotest_common.sh@650 -- # local es=0 00:06:38.020 05:31:49 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:38.020 05:31:49 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:38.020 05:31:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:38.020 05:31:49 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:38.020 05:31:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:38.020 05:31:49 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:06:38.020 05:31:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:38.020 05:31:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.020 05:31:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.020 05:31:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.020 05:31:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.021 05:31:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.021 05:31:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.021 05:31:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.021 05:31:49 -- accel/accel.sh@42 -- # jq -r . 
00:06:38.021 Unsupported workload type: foobar 00:06:38.021 [2024-11-29 05:31:49.134615] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:38.021 accel_perf options: 00:06:38.021 [-h help message] 00:06:38.021 [-q queue depth per core] 00:06:38.021 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:38.021 [-T number of threads per core 00:06:38.021 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:38.021 [-t time in seconds] 00:06:38.021 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:38.021 [ dif_verify, , dif_generate, dif_generate_copy 00:06:38.021 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:38.021 [-l for compress/decompress workloads, name of uncompressed input file 00:06:38.021 [-S for crc32c workload, use this seed value (default 0) 00:06:38.021 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:38.021 [-f for fill workload, use this BYTE value (default 255) 00:06:38.021 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:38.021 [-y verify result if this switch is on] 00:06:38.021 [-a tasks to allocate per core (default: same value as -q)] 00:06:38.021 Can be used to spread operations across a wider range of memory. 00:06:38.021 05:31:49 -- common/autotest_common.sh@653 -- # es=1 00:06:38.021 05:31:49 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:38.021 05:31:49 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:38.021 05:31:49 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:38.021 00:06:38.021 real 0m0.027s 00:06:38.021 user 0m0.011s 00:06:38.021 sys 0m0.016s 00:06:38.021 05:31:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:38.021 05:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:38.021 ************************************ 00:06:38.021 END TEST accel_wrong_workload 00:06:38.021 ************************************ 00:06:38.021 Error: writing output failed: Broken pipe 00:06:38.021 05:31:49 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:38.021 05:31:49 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:38.021 05:31:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.021 05:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:38.021 ************************************ 00:06:38.021 START TEST accel_negative_buffers 00:06:38.021 ************************************ 00:06:38.021 05:31:49 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:38.021 05:31:49 -- common/autotest_common.sh@650 -- # local es=0 00:06:38.021 05:31:49 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:38.021 05:31:49 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:38.021 05:31:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:38.021 05:31:49 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:38.021 05:31:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:38.021 05:31:49 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:06:38.021 05:31:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:38.021 05:31:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.021 05:31:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.021 05:31:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.021 05:31:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.021 05:31:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.021 05:31:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.021 05:31:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.021 05:31:49 -- accel/accel.sh@42 -- # jq -r . 00:06:38.021 -x option must be non-negative. 00:06:38.021 [2024-11-29 05:31:49.200448] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:38.021 accel_perf options: 00:06:38.021 [-h help message] 00:06:38.021 [-q queue depth per core] 00:06:38.021 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:38.021 [-T number of threads per core 00:06:38.021 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:38.021 [-t time in seconds] 00:06:38.021 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:38.021 [ dif_verify, , dif_generate, dif_generate_copy 00:06:38.021 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:38.021 [-l for compress/decompress workloads, name of uncompressed input file 00:06:38.021 [-S for crc32c workload, use this seed value (default 0) 00:06:38.021 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:38.021 [-f for fill workload, use this BYTE value (default 255) 00:06:38.021 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:38.021 [-y verify result if this switch is on] 00:06:38.021 [-a tasks to allocate per core (default: same value as -q)] 00:06:38.021 Can be used to spread operations across a wider range of memory. 
00:06:38.021 05:31:49 -- common/autotest_common.sh@653 -- # es=1 00:06:38.021 05:31:49 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:38.021 05:31:49 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:38.021 05:31:49 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:38.021 00:06:38.021 real 0m0.028s 00:06:38.021 user 0m0.016s 00:06:38.021 sys 0m0.012s 00:06:38.021 05:31:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:38.021 05:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:38.021 ************************************ 00:06:38.021 END TEST accel_negative_buffers 00:06:38.021 ************************************ 00:06:38.021 Error: writing output failed: Broken pipe 00:06:38.021 05:31:49 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:38.021 05:31:49 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:38.021 05:31:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.021 05:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:38.021 ************************************ 00:06:38.021 START TEST accel_crc32c 00:06:38.021 ************************************ 00:06:38.021 05:31:49 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:38.021 05:31:49 -- accel/accel.sh@16 -- # local accel_opc 00:06:38.021 05:31:49 -- accel/accel.sh@17 -- # local accel_module 00:06:38.021 05:31:49 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:38.021 05:31:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:38.021 05:31:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.021 05:31:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.021 05:31:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.021 05:31:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.021 05:31:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.021 05:31:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.021 05:31:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.021 05:31:49 -- accel/accel.sh@42 -- # jq -r . 00:06:38.021 [2024-11-29 05:31:49.260674] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:38.021 [2024-11-29 05:31:49.260715] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2202130 ] 00:06:38.021 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.281 [2024-11-29 05:31:49.323473] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.281 [2024-11-29 05:31:49.359815] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.662 05:31:50 -- accel/accel.sh@18 -- # out=' 00:06:39.662 SPDK Configuration: 00:06:39.662 Core mask: 0x1 00:06:39.662 00:06:39.662 Accel Perf Configuration: 00:06:39.662 Workload Type: crc32c 00:06:39.662 CRC-32C seed: 32 00:06:39.662 Transfer size: 4096 bytes 00:06:39.662 Vector count 1 00:06:39.662 Module: software 00:06:39.662 Queue depth: 32 00:06:39.662 Allocate depth: 32 00:06:39.662 # threads/core: 1 00:06:39.662 Run time: 1 seconds 00:06:39.662 Verify: Yes 00:06:39.662 00:06:39.662 Running for 1 seconds... 
00:06:39.662 00:06:39.662 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:39.662 ------------------------------------------------------------------------------------ 00:06:39.662 0,0 831456/s 3247 MiB/s 0 0 00:06:39.662 ==================================================================================== 00:06:39.662 Total 831456/s 3247 MiB/s 0 0' 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:39.662 05:31:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.662 05:31:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.662 05:31:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.662 05:31:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:39.662 05:31:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.662 05:31:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.662 05:31:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.662 05:31:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.662 05:31:50 -- accel/accel.sh@42 -- # jq -r . 00:06:39.662 [2024-11-29 05:31:50.542694] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:39.662 [2024-11-29 05:31:50.542784] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2202399 ] 00:06:39.662 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.662 [2024-11-29 05:31:50.610122] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.662 [2024-11-29 05:31:50.644559] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val= 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val= 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val=0x1 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val= 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val= 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val=crc32c 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val=32 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 
05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val= 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val=software 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@23 -- # accel_module=software 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val=32 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val=32 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.662 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.662 05:31:50 -- accel/accel.sh@21 -- # val=1 00:06:39.662 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.663 05:31:50 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:39.663 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.663 05:31:50 -- accel/accel.sh@21 -- # val=Yes 00:06:39.663 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.663 05:31:50 -- accel/accel.sh@21 -- # val= 00:06:39.663 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:39.663 05:31:50 -- accel/accel.sh@21 -- # val= 00:06:39.663 05:31:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # IFS=: 00:06:39.663 05:31:50 -- accel/accel.sh@20 -- # read -r var val 00:06:40.598 05:31:51 -- accel/accel.sh@21 -- # val= 00:06:40.598 05:31:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # IFS=: 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # read -r var val 00:06:40.598 05:31:51 -- accel/accel.sh@21 -- # val= 00:06:40.598 05:31:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # IFS=: 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # read -r var val 00:06:40.598 05:31:51 -- accel/accel.sh@21 -- # val= 00:06:40.598 05:31:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # IFS=: 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # read -r var val 00:06:40.598 05:31:51 -- accel/accel.sh@21 -- # val= 00:06:40.598 05:31:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # IFS=: 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # read -r var val 00:06:40.598 05:31:51 -- accel/accel.sh@21 -- # val= 00:06:40.598 05:31:51 -- accel/accel.sh@22 -- # case "$var" in 
00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # IFS=: 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # read -r var val 00:06:40.598 05:31:51 -- accel/accel.sh@21 -- # val= 00:06:40.598 05:31:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # IFS=: 00:06:40.598 05:31:51 -- accel/accel.sh@20 -- # read -r var val 00:06:40.598 05:31:51 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:40.598 05:31:51 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:40.598 05:31:51 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.598 00:06:40.598 real 0m2.562s 00:06:40.598 user 0m2.331s 00:06:40.598 sys 0m0.241s 00:06:40.598 05:31:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:40.598 05:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:40.598 ************************************ 00:06:40.598 END TEST accel_crc32c 00:06:40.598 ************************************ 00:06:40.598 05:31:51 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:40.598 05:31:51 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:40.598 05:31:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:40.598 05:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:40.598 ************************************ 00:06:40.598 START TEST accel_crc32c_C2 00:06:40.598 ************************************ 00:06:40.598 05:31:51 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:40.598 05:31:51 -- accel/accel.sh@16 -- # local accel_opc 00:06:40.598 05:31:51 -- accel/accel.sh@17 -- # local accel_module 00:06:40.598 05:31:51 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:40.598 05:31:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:40.598 05:31:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.598 05:31:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.598 05:31:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.598 05:31:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.598 05:31:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.598 05:31:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.598 05:31:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.598 05:31:51 -- accel/accel.sh@42 -- # jq -r . 00:06:40.598 [2024-11-29 05:31:51.878329] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:40.598 [2024-11-29 05:31:51.878422] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2202680 ] 00:06:40.857 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.857 [2024-11-29 05:31:51.946578] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.857 [2024-11-29 05:31:51.982465] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.234 05:31:53 -- accel/accel.sh@18 -- # out=' 00:06:42.234 SPDK Configuration: 00:06:42.234 Core mask: 0x1 00:06:42.234 00:06:42.234 Accel Perf Configuration: 00:06:42.234 Workload Type: crc32c 00:06:42.234 CRC-32C seed: 0 00:06:42.234 Transfer size: 4096 bytes 00:06:42.234 Vector count 2 00:06:42.234 Module: software 00:06:42.234 Queue depth: 32 00:06:42.234 Allocate depth: 32 00:06:42.234 # threads/core: 1 00:06:42.234 Run time: 1 seconds 00:06:42.234 Verify: Yes 00:06:42.234 00:06:42.234 Running for 1 seconds... 00:06:42.234 00:06:42.234 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:42.234 ------------------------------------------------------------------------------------ 00:06:42.234 0,0 616448/s 4816 MiB/s 0 0 00:06:42.234 ==================================================================================== 00:06:42.234 Total 616448/s 2408 MiB/s 0 0' 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:42.234 05:31:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.234 05:31:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.234 05:31:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:42.234 05:31:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.234 05:31:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.234 05:31:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.234 05:31:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.234 05:31:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.234 05:31:53 -- accel/accel.sh@42 -- # jq -r . 00:06:42.234 [2024-11-29 05:31:53.165297] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:42.234 [2024-11-29 05:31:53.165389] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2202832 ] 00:06:42.234 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.234 [2024-11-29 05:31:53.232359] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.234 [2024-11-29 05:31:53.267052] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val=0x1 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val=crc32c 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val=0 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val=software 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@23 -- # accel_module=software 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val=32 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val=32 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- 
accel/accel.sh@21 -- # val=1 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val=Yes 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.234 05:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.234 05:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.234 05:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:43.169 05:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.169 05:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.169 05:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.169 05:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.169 05:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.169 05:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.169 05:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.169 05:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.169 05:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.169 05:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.169 05:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.169 05:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.169 05:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.169 05:31:54 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:43.169 05:31:54 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:43.169 05:31:54 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.169 00:06:43.169 real 0m2.578s 00:06:43.169 user 0m2.339s 00:06:43.169 sys 0m0.249s 00:06:43.169 05:31:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:43.169 05:31:54 -- common/autotest_common.sh@10 -- # set +x 00:06:43.169 ************************************ 00:06:43.169 END TEST accel_crc32c_C2 00:06:43.169 ************************************ 00:06:43.428 05:31:54 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:43.428 05:31:54 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:43.428 05:31:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:43.428 05:31:54 -- common/autotest_common.sh@10 -- # set +x 00:06:43.428 ************************************ 00:06:43.428 START TEST accel_copy 
00:06:43.428 ************************************ 00:06:43.428 05:31:54 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:06:43.428 05:31:54 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.428 05:31:54 -- accel/accel.sh@17 -- # local accel_module 00:06:43.428 05:31:54 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:43.428 05:31:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:43.428 05:31:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.428 05:31:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.428 05:31:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.428 05:31:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.428 05:31:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.428 05:31:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.428 05:31:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.428 05:31:54 -- accel/accel.sh@42 -- # jq -r . 00:06:43.428 [2024-11-29 05:31:54.488585] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:43.428 [2024-11-29 05:31:54.488655] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2203015 ] 00:06:43.428 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.428 [2024-11-29 05:31:54.552604] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.428 [2024-11-29 05:31:54.588150] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.817 05:31:55 -- accel/accel.sh@18 -- # out=' 00:06:44.817 SPDK Configuration: 00:06:44.817 Core mask: 0x1 00:06:44.817 00:06:44.817 Accel Perf Configuration: 00:06:44.817 Workload Type: copy 00:06:44.817 Transfer size: 4096 bytes 00:06:44.817 Vector count 1 00:06:44.817 Module: software 00:06:44.817 Queue depth: 32 00:06:44.817 Allocate depth: 32 00:06:44.817 # threads/core: 1 00:06:44.817 Run time: 1 seconds 00:06:44.817 Verify: Yes 00:06:44.817 00:06:44.817 Running for 1 seconds... 00:06:44.817 00:06:44.817 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:44.817 ------------------------------------------------------------------------------------ 00:06:44.817 0,0 551616/s 2154 MiB/s 0 0 00:06:44.817 ==================================================================================== 00:06:44.817 Total 551616/s 2154 MiB/s 0 0' 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:44.817 05:31:55 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.817 05:31:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.817 05:31:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:44.817 05:31:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.817 05:31:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.817 05:31:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.817 05:31:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.817 05:31:55 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.817 05:31:55 -- accel/accel.sh@42 -- # jq -r . 00:06:44.817 [2024-11-29 05:31:55.771770] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:44.817 [2024-11-29 05:31:55.771863] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2203255 ] 00:06:44.817 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.817 [2024-11-29 05:31:55.839465] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.817 [2024-11-29 05:31:55.873770] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val= 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val= 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val=0x1 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val= 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val= 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val=copy 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val= 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val=software 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@23 -- # accel_module=software 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val=32 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val=32 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val=1 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val=Yes 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val= 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:44.817 05:31:55 -- accel/accel.sh@21 -- # val= 00:06:44.817 05:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:44.817 05:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.754 05:31:57 -- accel/accel.sh@21 -- # val= 00:06:45.754 05:31:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # IFS=: 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # read -r var val 00:06:45.754 05:31:57 -- accel/accel.sh@21 -- # val= 00:06:45.754 05:31:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # IFS=: 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # read -r var val 00:06:45.754 05:31:57 -- accel/accel.sh@21 -- # val= 00:06:45.754 05:31:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # IFS=: 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # read -r var val 00:06:45.754 05:31:57 -- accel/accel.sh@21 -- # val= 00:06:45.754 05:31:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # IFS=: 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # read -r var val 00:06:45.754 05:31:57 -- accel/accel.sh@21 -- # val= 00:06:45.754 05:31:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # IFS=: 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # read -r var val 00:06:45.754 05:31:57 -- accel/accel.sh@21 -- # val= 00:06:45.754 05:31:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # IFS=: 00:06:45.754 05:31:57 -- accel/accel.sh@20 -- # read -r var val 00:06:45.754 05:31:57 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:45.754 05:31:57 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:45.754 05:31:57 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.754 00:06:45.754 real 0m2.564s 00:06:45.754 user 0m2.327s 00:06:45.754 sys 0m0.245s 00:06:45.754 05:31:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:45.754 05:31:57 -- common/autotest_common.sh@10 -- # set +x 00:06:45.754 ************************************ 00:06:45.754 END TEST accel_copy 00:06:45.754 ************************************ 00:06:46.012 05:31:57 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.012 05:31:57 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:46.012 05:31:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:46.012 05:31:57 -- common/autotest_common.sh@10 -- # set +x 00:06:46.012 ************************************ 00:06:46.012 START TEST accel_fill 00:06:46.012 ************************************ 00:06:46.012 05:31:57 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.012 05:31:57 -- accel/accel.sh@16 -- # local accel_opc 
00:06:46.012 05:31:57 -- accel/accel.sh@17 -- # local accel_module 00:06:46.012 05:31:57 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.012 05:31:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.012 05:31:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.012 05:31:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.013 05:31:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.013 05:31:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.013 05:31:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.013 05:31:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.013 05:31:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.013 05:31:57 -- accel/accel.sh@42 -- # jq -r . 00:06:46.013 [2024-11-29 05:31:57.106577] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:46.013 [2024-11-29 05:31:57.106683] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2203538 ] 00:06:46.013 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.013 [2024-11-29 05:31:57.177684] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.013 [2024-11-29 05:31:57.213672] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.388 05:31:58 -- accel/accel.sh@18 -- # out=' 00:06:47.388 SPDK Configuration: 00:06:47.388 Core mask: 0x1 00:06:47.388 00:06:47.388 Accel Perf Configuration: 00:06:47.388 Workload Type: fill 00:06:47.388 Fill pattern: 0x80 00:06:47.388 Transfer size: 4096 bytes 00:06:47.388 Vector count 1 00:06:47.388 Module: software 00:06:47.388 Queue depth: 64 00:06:47.388 Allocate depth: 64 00:06:47.388 # threads/core: 1 00:06:47.388 Run time: 1 seconds 00:06:47.388 Verify: Yes 00:06:47.388 00:06:47.388 Running for 1 seconds... 00:06:47.388 00:06:47.388 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:47.388 ------------------------------------------------------------------------------------ 00:06:47.388 0,0 958144/s 3742 MiB/s 0 0 00:06:47.388 ==================================================================================== 00:06:47.388 Total 958144/s 3742 MiB/s 0 0' 00:06:47.388 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.388 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.388 05:31:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:47.388 05:31:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:47.388 05:31:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.388 05:31:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.388 05:31:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.388 05:31:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.388 05:31:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.388 05:31:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.388 05:31:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.388 05:31:58 -- accel/accel.sh@42 -- # jq -r . 00:06:47.388 [2024-11-29 05:31:58.395697] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:47.388 [2024-11-29 05:31:58.395792] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2203808 ] 00:06:47.388 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.388 [2024-11-29 05:31:58.463119] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.388 [2024-11-29 05:31:58.497740] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.388 05:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.388 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.388 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.388 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.388 05:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.388 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.388 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val=0x1 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val=fill 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val=0x80 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val=software 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@23 -- # accel_module=software 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val=64 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val=64 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- 
accel/accel.sh@21 -- # val=1 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val=Yes 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.389 05:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.389 05:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.389 05:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:48.767 05:31:59 -- accel/accel.sh@21 -- # val= 00:06:48.767 05:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:48.767 05:31:59 -- accel/accel.sh@21 -- # val= 00:06:48.767 05:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:48.767 05:31:59 -- accel/accel.sh@21 -- # val= 00:06:48.767 05:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:48.767 05:31:59 -- accel/accel.sh@21 -- # val= 00:06:48.767 05:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:48.767 05:31:59 -- accel/accel.sh@21 -- # val= 00:06:48.767 05:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:48.767 05:31:59 -- accel/accel.sh@21 -- # val= 00:06:48.767 05:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:48.767 05:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:48.767 05:31:59 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:48.767 05:31:59 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:48.767 05:31:59 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:48.767 00:06:48.767 real 0m2.583s 00:06:48.767 user 0m2.344s 00:06:48.767 sys 0m0.247s 00:06:48.767 05:31:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:48.767 05:31:59 -- common/autotest_common.sh@10 -- # set +x 00:06:48.767 ************************************ 00:06:48.767 END TEST accel_fill 00:06:48.767 ************************************ 00:06:48.767 05:31:59 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:48.767 05:31:59 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:48.767 05:31:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:48.767 05:31:59 -- common/autotest_common.sh@10 -- # set +x 00:06:48.767 ************************************ 00:06:48.767 START TEST 
accel_copy_crc32c 00:06:48.767 ************************************ 00:06:48.767 05:31:59 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:06:48.767 05:31:59 -- accel/accel.sh@16 -- # local accel_opc 00:06:48.767 05:31:59 -- accel/accel.sh@17 -- # local accel_module 00:06:48.767 05:31:59 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:48.767 05:31:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:48.767 05:31:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.767 05:31:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.767 05:31:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.767 05:31:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.767 05:31:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.767 05:31:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.767 05:31:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.767 05:31:59 -- accel/accel.sh@42 -- # jq -r . 00:06:48.767 [2024-11-29 05:31:59.735800] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:48.767 [2024-11-29 05:31:59.735899] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204098 ] 00:06:48.767 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.767 [2024-11-29 05:31:59.803670] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.767 [2024-11-29 05:31:59.838924] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.702 05:32:01 -- accel/accel.sh@18 -- # out=' 00:06:49.702 SPDK Configuration: 00:06:49.702 Core mask: 0x1 00:06:49.702 00:06:49.702 Accel Perf Configuration: 00:06:49.702 Workload Type: copy_crc32c 00:06:49.702 CRC-32C seed: 0 00:06:49.702 Vector size: 4096 bytes 00:06:49.702 Transfer size: 4096 bytes 00:06:49.702 Vector count 1 00:06:49.702 Module: software 00:06:49.702 Queue depth: 32 00:06:49.702 Allocate depth: 32 00:06:49.702 # threads/core: 1 00:06:49.702 Run time: 1 seconds 00:06:49.702 Verify: Yes 00:06:49.702 00:06:49.702 Running for 1 seconds... 00:06:49.702 00:06:49.702 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:49.702 ------------------------------------------------------------------------------------ 00:06:49.702 0,0 412480/s 1611 MiB/s 0 0 00:06:49.702 ==================================================================================== 00:06:49.702 Total 412480/s 1611 MiB/s 0 0' 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:49.961 05:32:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:49.961 05:32:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.961 05:32:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.961 05:32:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.961 05:32:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.961 05:32:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.961 05:32:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.961 05:32:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.961 05:32:01 -- accel/accel.sh@42 -- # jq -r . 
00:06:49.961 [2024-11-29 05:32:01.024377] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:49.961 [2024-11-29 05:32:01.024468] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204328 ] 00:06:49.961 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.961 [2024-11-29 05:32:01.091675] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.961 [2024-11-29 05:32:01.126465] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val= 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val= 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val=0x1 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val= 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val= 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val=0 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val= 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val=software 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@23 -- # accel_module=software 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val=32 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 
00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val=32 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val=1 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val=Yes 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val= 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:49.961 05:32:01 -- accel/accel.sh@21 -- # val= 00:06:49.961 05:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:49.961 05:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:51.335 05:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.335 05:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.335 05:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.335 05:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.335 05:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.335 05:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.335 05:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.335 05:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.335 05:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.336 05:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.336 05:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.336 05:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.336 05:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.336 05:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.336 05:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.336 05:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.336 05:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.336 05:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.336 05:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.336 05:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.336 05:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.336 05:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.336 05:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.336 05:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.336 05:32:02 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:51.336 05:32:02 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:51.336 05:32:02 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.336 00:06:51.336 real 0m2.581s 00:06:51.336 user 0m2.338s 00:06:51.336 sys 0m0.252s 00:06:51.336 05:32:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:51.336 05:32:02 -- common/autotest_common.sh@10 -- # set +x 00:06:51.336 ************************************ 00:06:51.336 END TEST accel_copy_crc32c 00:06:51.336 ************************************ 00:06:51.336 
05:32:02 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:51.336 05:32:02 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:51.336 05:32:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:51.336 05:32:02 -- common/autotest_common.sh@10 -- # set +x 00:06:51.336 ************************************ 00:06:51.336 START TEST accel_copy_crc32c_C2 00:06:51.336 ************************************ 00:06:51.336 05:32:02 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:51.336 05:32:02 -- accel/accel.sh@16 -- # local accel_opc 00:06:51.336 05:32:02 -- accel/accel.sh@17 -- # local accel_module 00:06:51.336 05:32:02 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:51.336 05:32:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:51.336 05:32:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.336 05:32:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.336 05:32:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.336 05:32:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.336 05:32:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.336 05:32:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.336 05:32:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.336 05:32:02 -- accel/accel.sh@42 -- # jq -r . 00:06:51.336 [2024-11-29 05:32:02.366397] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:51.336 [2024-11-29 05:32:02.366507] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204520 ] 00:06:51.336 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.336 [2024-11-29 05:32:02.435414] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.336 [2024-11-29 05:32:02.471280] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.710 05:32:03 -- accel/accel.sh@18 -- # out=' 00:06:52.710 SPDK Configuration: 00:06:52.710 Core mask: 0x1 00:06:52.710 00:06:52.710 Accel Perf Configuration: 00:06:52.710 Workload Type: copy_crc32c 00:06:52.710 CRC-32C seed: 0 00:06:52.710 Vector size: 4096 bytes 00:06:52.710 Transfer size: 8192 bytes 00:06:52.710 Vector count 2 00:06:52.710 Module: software 00:06:52.710 Queue depth: 32 00:06:52.710 Allocate depth: 32 00:06:52.710 # threads/core: 1 00:06:52.710 Run time: 1 seconds 00:06:52.710 Verify: Yes 00:06:52.710 00:06:52.710 Running for 1 seconds... 
00:06:52.710 00:06:52.710 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:52.710 ------------------------------------------------------------------------------------ 00:06:52.710 0,0 300800/s 2350 MiB/s 0 0 00:06:52.710 ==================================================================================== 00:06:52.710 Total 300800/s 1175 MiB/s 0 0' 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.710 05:32:03 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:52.710 05:32:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:52.710 05:32:03 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.710 05:32:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.710 05:32:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.710 05:32:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.710 05:32:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.710 05:32:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.710 05:32:03 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.710 05:32:03 -- accel/accel.sh@42 -- # jq -r . 00:06:52.710 [2024-11-29 05:32:03.654828] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:52.710 [2024-11-29 05:32:03.654921] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204677 ] 00:06:52.710 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.710 [2024-11-29 05:32:03.724251] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.710 [2024-11-29 05:32:03.759104] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.710 05:32:03 -- accel/accel.sh@21 -- # val= 00:06:52.710 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.710 05:32:03 -- accel/accel.sh@21 -- # val= 00:06:52.710 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.710 05:32:03 -- accel/accel.sh@21 -- # val=0x1 00:06:52.710 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.710 05:32:03 -- accel/accel.sh@21 -- # val= 00:06:52.710 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.710 05:32:03 -- accel/accel.sh@21 -- # val= 00:06:52.710 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.710 05:32:03 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:52.710 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.710 05:32:03 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:52.710 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val=0 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # 
IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val= 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val=software 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@23 -- # accel_module=software 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val=32 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val=32 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val=1 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val=Yes 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val= 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:52.711 05:32:03 -- accel/accel.sh@21 -- # val= 00:06:52.711 05:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:52.711 05:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.647 05:32:04 -- accel/accel.sh@21 -- # val= 00:06:53.647 05:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:53.647 05:32:04 -- accel/accel.sh@21 -- # val= 00:06:53.647 05:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:53.647 05:32:04 -- accel/accel.sh@21 -- # val= 00:06:53.647 05:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:53.647 05:32:04 -- accel/accel.sh@21 -- # val= 00:06:53.647 05:32:04 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:53.647 05:32:04 -- accel/accel.sh@21 -- # val= 00:06:53.647 05:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:53.647 05:32:04 -- accel/accel.sh@21 -- # val= 00:06:53.647 05:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:53.647 05:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:53.647 05:32:04 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:53.647 05:32:04 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:53.647 05:32:04 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.647 00:06:53.647 real 0m2.582s 00:06:53.647 user 0m2.328s 00:06:53.647 sys 0m0.263s 00:06:53.647 05:32:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:53.647 05:32:04 -- common/autotest_common.sh@10 -- # set +x 00:06:53.647 ************************************ 00:06:53.647 END TEST accel_copy_crc32c_C2 00:06:53.647 ************************************ 00:06:53.904 05:32:04 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:53.904 05:32:04 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:53.904 05:32:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:53.904 05:32:04 -- common/autotest_common.sh@10 -- # set +x 00:06:53.904 ************************************ 00:06:53.904 START TEST accel_dualcast 00:06:53.904 ************************************ 00:06:53.904 05:32:04 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:06:53.904 05:32:04 -- accel/accel.sh@16 -- # local accel_opc 00:06:53.904 05:32:04 -- accel/accel.sh@17 -- # local accel_module 00:06:53.904 05:32:04 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:53.904 05:32:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:53.904 05:32:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.904 05:32:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.904 05:32:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.904 05:32:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.904 05:32:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.904 05:32:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.904 05:32:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.904 05:32:04 -- accel/accel.sh@42 -- # jq -r . 00:06:53.904 [2024-11-29 05:32:04.997739] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:53.904 [2024-11-29 05:32:04.997831] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2204953 ] 00:06:53.904 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.904 [2024-11-29 05:32:05.065757] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.904 [2024-11-29 05:32:05.100265] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.276 05:32:06 -- accel/accel.sh@18 -- # out=' 00:06:55.276 SPDK Configuration: 00:06:55.276 Core mask: 0x1 00:06:55.276 00:06:55.276 Accel Perf Configuration: 00:06:55.276 Workload Type: dualcast 00:06:55.276 Transfer size: 4096 bytes 00:06:55.276 Vector count 1 00:06:55.276 Module: software 00:06:55.276 Queue depth: 32 00:06:55.276 Allocate depth: 32 00:06:55.276 # threads/core: 1 00:06:55.276 Run time: 1 seconds 00:06:55.276 Verify: Yes 00:06:55.276 00:06:55.276 Running for 1 seconds... 00:06:55.276 00:06:55.276 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:55.276 ------------------------------------------------------------------------------------ 00:06:55.276 0,0 634208/s 2477 MiB/s 0 0 00:06:55.276 ==================================================================================== 00:06:55.276 Total 634208/s 2477 MiB/s 0 0' 00:06:55.276 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.276 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.276 05:32:06 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:55.276 05:32:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:55.276 05:32:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.276 05:32:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.276 05:32:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.276 05:32:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.276 05:32:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.276 05:32:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.276 05:32:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.276 05:32:06 -- accel/accel.sh@42 -- # jq -r . 00:06:55.276 [2024-11-29 05:32:06.283655] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:55.276 [2024-11-29 05:32:06.283748] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2205221 ] 00:06:55.276 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.276 [2024-11-29 05:32:06.352661] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.277 [2024-11-29 05:32:06.387397] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val=0x1 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val=dualcast 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val=software 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@23 -- # accel_module=software 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val=32 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val=32 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val=1 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 
-- accel/accel.sh@21 -- # val='1 seconds' 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val=Yes 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.277 05:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.277 05:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.277 05:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:56.650 05:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.650 05:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.650 05:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.650 05:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.650 05:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.650 05:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.650 05:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.650 05:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.650 05:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.650 05:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.650 05:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.650 05:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.650 05:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.650 05:32:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:56.650 05:32:07 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:56.650 05:32:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.650 00:06:56.650 real 0m2.580s 00:06:56.650 user 0m2.335s 00:06:56.650 sys 0m0.253s 00:06:56.650 05:32:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:56.650 05:32:07 -- common/autotest_common.sh@10 -- # set +x 00:06:56.650 ************************************ 00:06:56.650 END TEST accel_dualcast 00:06:56.650 ************************************ 00:06:56.650 05:32:07 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:56.650 05:32:07 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:56.650 05:32:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:56.650 05:32:07 -- common/autotest_common.sh@10 -- # set +x 00:06:56.650 ************************************ 00:06:56.650 START TEST accel_compare 00:06:56.650 ************************************ 00:06:56.650 05:32:07 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:06:56.650 05:32:07 -- accel/accel.sh@16 -- # local accel_opc 00:06:56.650 05:32:07 
-- accel/accel.sh@17 -- # local accel_module 00:06:56.650 05:32:07 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:56.650 05:32:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:56.650 05:32:07 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.650 05:32:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.650 05:32:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.650 05:32:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.650 05:32:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.650 05:32:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.650 05:32:07 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.650 05:32:07 -- accel/accel.sh@42 -- # jq -r . 00:06:56.650 [2024-11-29 05:32:07.625247] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:56.650 [2024-11-29 05:32:07.625351] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2205507 ] 00:06:56.650 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.650 [2024-11-29 05:32:07.693320] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.650 [2024-11-29 05:32:07.728806] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.026 05:32:08 -- accel/accel.sh@18 -- # out=' 00:06:58.026 SPDK Configuration: 00:06:58.026 Core mask: 0x1 00:06:58.026 00:06:58.026 Accel Perf Configuration: 00:06:58.026 Workload Type: compare 00:06:58.026 Transfer size: 4096 bytes 00:06:58.026 Vector count 1 00:06:58.026 Module: software 00:06:58.026 Queue depth: 32 00:06:58.026 Allocate depth: 32 00:06:58.026 # threads/core: 1 00:06:58.026 Run time: 1 seconds 00:06:58.026 Verify: Yes 00:06:58.026 00:06:58.026 Running for 1 seconds... 00:06:58.026 00:06:58.026 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:58.026 ------------------------------------------------------------------------------------ 00:06:58.026 0,0 832672/s 3252 MiB/s 0 0 00:06:58.026 ==================================================================================== 00:06:58.026 Total 832672/s 3252 MiB/s 0 0' 00:06:58.026 05:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:58.026 05:32:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:58.026 05:32:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.026 05:32:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.026 05:32:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.026 05:32:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.026 05:32:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.026 05:32:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.026 05:32:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.026 05:32:08 -- accel/accel.sh@42 -- # jq -r . 00:06:58.026 [2024-11-29 05:32:08.911707] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:58.026 [2024-11-29 05:32:08.911816] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2205782 ] 00:06:58.026 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.026 [2024-11-29 05:32:08.979359] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.026 [2024-11-29 05:32:09.013774] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val= 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val= 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val=0x1 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val= 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val= 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val=compare 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val= 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val=software 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@23 -- # accel_module=software 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val=32 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val=32 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val=1 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val=Yes 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val= 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.026 05:32:09 -- accel/accel.sh@21 -- # val= 00:06:58.026 05:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:58.026 05:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:58.961 05:32:10 -- accel/accel.sh@21 -- # val= 00:06:58.961 05:32:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # IFS=: 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # read -r var val 00:06:58.961 05:32:10 -- accel/accel.sh@21 -- # val= 00:06:58.961 05:32:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # IFS=: 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # read -r var val 00:06:58.961 05:32:10 -- accel/accel.sh@21 -- # val= 00:06:58.961 05:32:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # IFS=: 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # read -r var val 00:06:58.961 05:32:10 -- accel/accel.sh@21 -- # val= 00:06:58.961 05:32:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # IFS=: 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # read -r var val 00:06:58.961 05:32:10 -- accel/accel.sh@21 -- # val= 00:06:58.961 05:32:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # IFS=: 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # read -r var val 00:06:58.961 05:32:10 -- accel/accel.sh@21 -- # val= 00:06:58.961 05:32:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # IFS=: 00:06:58.961 05:32:10 -- accel/accel.sh@20 -- # read -r var val 00:06:58.961 05:32:10 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:58.961 05:32:10 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:58.961 05:32:10 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.961 00:06:58.961 real 0m2.580s 00:06:58.961 user 0m2.318s 00:06:58.961 sys 0m0.268s 00:06:58.961 05:32:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:58.961 05:32:10 -- common/autotest_common.sh@10 -- # set +x 00:06:58.961 ************************************ 00:06:58.961 END TEST accel_compare 00:06:58.961 ************************************ 00:06:58.961 05:32:10 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:58.961 05:32:10 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:58.961 05:32:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.961 05:32:10 -- common/autotest_common.sh@10 -- # set +x 00:06:58.961 ************************************ 00:06:58.961 START TEST accel_xor 00:06:58.961 ************************************ 00:06:58.961 05:32:10 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:06:58.961 05:32:10 -- accel/accel.sh@16 -- # local accel_opc 00:06:58.961 05:32:10 -- accel/accel.sh@17 
-- # local accel_module 00:06:58.961 05:32:10 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:58.961 05:32:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:58.961 05:32:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.961 05:32:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.961 05:32:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.961 05:32:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.961 05:32:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.961 05:32:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.961 05:32:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.961 05:32:10 -- accel/accel.sh@42 -- # jq -r . 00:06:58.961 [2024-11-29 05:32:10.253793] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:58.961 [2024-11-29 05:32:10.253882] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2206036 ] 00:06:59.219 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.219 [2024-11-29 05:32:10.322974] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.219 [2024-11-29 05:32:10.359272] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.643 05:32:11 -- accel/accel.sh@18 -- # out=' 00:07:00.643 SPDK Configuration: 00:07:00.643 Core mask: 0x1 00:07:00.643 00:07:00.643 Accel Perf Configuration: 00:07:00.643 Workload Type: xor 00:07:00.643 Source buffers: 2 00:07:00.643 Transfer size: 4096 bytes 00:07:00.643 Vector count 1 00:07:00.644 Module: software 00:07:00.644 Queue depth: 32 00:07:00.644 Allocate depth: 32 00:07:00.644 # threads/core: 1 00:07:00.644 Run time: 1 seconds 00:07:00.644 Verify: Yes 00:07:00.644 00:07:00.644 Running for 1 seconds... 00:07:00.644 00:07:00.644 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:00.644 ------------------------------------------------------------------------------------ 00:07:00.644 0,0 670400/s 2618 MiB/s 0 0 00:07:00.644 ==================================================================================== 00:07:00.644 Total 670400/s 2618 MiB/s 0 0' 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:00.644 05:32:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:00.644 05:32:11 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.644 05:32:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.644 05:32:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.644 05:32:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.644 05:32:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.644 05:32:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.644 05:32:11 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.644 05:32:11 -- accel/accel.sh@42 -- # jq -r . 00:07:00.644 [2024-11-29 05:32:11.542497] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:00.644 [2024-11-29 05:32:11.542595] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2206180 ] 00:07:00.644 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.644 [2024-11-29 05:32:11.611523] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.644 [2024-11-29 05:32:11.646074] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val=0x1 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val=xor 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val=2 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val=software 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@23 -- # accel_module=software 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val=32 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val=32 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- 
accel/accel.sh@21 -- # val=1 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val=Yes 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.644 05:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.644 05:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.644 05:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:01.576 05:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.576 05:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.576 05:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.576 05:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.576 05:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.576 05:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.576 05:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.576 05:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.576 05:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.576 05:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.576 05:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.576 05:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.576 05:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.576 05:32:12 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:01.576 05:32:12 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:01.576 05:32:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.576 00:07:01.576 real 0m2.579s 00:07:01.576 user 0m2.327s 00:07:01.576 sys 0m0.260s 00:07:01.576 05:32:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:01.576 05:32:12 -- common/autotest_common.sh@10 -- # set +x 00:07:01.576 ************************************ 00:07:01.576 END TEST accel_xor 00:07:01.576 ************************************ 00:07:01.576 05:32:12 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:01.576 05:32:12 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:01.576 05:32:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:01.576 05:32:12 -- common/autotest_common.sh@10 -- # set +x 00:07:01.576 ************************************ 00:07:01.576 START TEST accel_xor 
00:07:01.576 ************************************ 00:07:01.576 05:32:12 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:07:01.576 05:32:12 -- accel/accel.sh@16 -- # local accel_opc 00:07:01.576 05:32:12 -- accel/accel.sh@17 -- # local accel_module 00:07:01.576 05:32:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:07:01.576 05:32:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:01.576 05:32:12 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.576 05:32:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:01.576 05:32:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.576 05:32:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.576 05:32:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:01.576 05:32:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:01.576 05:32:12 -- accel/accel.sh@41 -- # local IFS=, 00:07:01.576 05:32:12 -- accel/accel.sh@42 -- # jq -r . 00:07:01.833 [2024-11-29 05:32:12.884147] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:01.833 [2024-11-29 05:32:12.884238] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2206374 ] 00:07:01.833 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.833 [2024-11-29 05:32:12.955255] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.833 [2024-11-29 05:32:12.990765] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.205 05:32:14 -- accel/accel.sh@18 -- # out=' 00:07:03.205 SPDK Configuration: 00:07:03.205 Core mask: 0x1 00:07:03.205 00:07:03.205 Accel Perf Configuration: 00:07:03.205 Workload Type: xor 00:07:03.205 Source buffers: 3 00:07:03.205 Transfer size: 4096 bytes 00:07:03.205 Vector count 1 00:07:03.205 Module: software 00:07:03.205 Queue depth: 32 00:07:03.205 Allocate depth: 32 00:07:03.205 # threads/core: 1 00:07:03.205 Run time: 1 seconds 00:07:03.205 Verify: Yes 00:07:03.205 00:07:03.205 Running for 1 seconds... 00:07:03.205 00:07:03.205 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:03.205 ------------------------------------------------------------------------------------ 00:07:03.205 0,0 661152/s 2582 MiB/s 0 0 00:07:03.205 ==================================================================================== 00:07:03.205 Total 661152/s 2582 MiB/s 0 0' 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.205 05:32:14 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:03.205 05:32:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:03.205 05:32:14 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.205 05:32:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.205 05:32:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.205 05:32:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.205 05:32:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.205 05:32:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.205 05:32:14 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.205 05:32:14 -- accel/accel.sh@42 -- # jq -r . 00:07:03.205 [2024-11-29 05:32:14.172591] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:03.205 [2024-11-29 05:32:14.172692] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2206641 ] 00:07:03.205 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.205 [2024-11-29 05:32:14.239702] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.205 [2024-11-29 05:32:14.274117] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.205 05:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.205 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.205 05:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.205 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.205 05:32:14 -- accel/accel.sh@21 -- # val=0x1 00:07:03.205 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.205 05:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.205 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.205 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val=xor 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val=3 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val=software 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@23 -- # accel_module=software 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val=32 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val=32 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- 
accel/accel.sh@21 -- # val=1 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val=Yes 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.206 05:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.206 05:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.206 05:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:04.136 05:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.136 05:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.136 05:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.136 05:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.136 05:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.136 05:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.136 05:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.136 05:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.136 05:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.395 05:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.395 05:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.395 05:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.395 05:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.395 05:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.395 05:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.395 05:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.395 05:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.395 05:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.395 05:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.395 05:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.395 05:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.395 05:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.395 05:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.395 05:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.395 05:32:15 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:04.395 05:32:15 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:04.395 05:32:15 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.395 00:07:04.395 real 0m2.581s 00:07:04.395 user 0m2.322s 00:07:04.395 sys 0m0.267s 00:07:04.395 05:32:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:04.395 05:32:15 -- common/autotest_common.sh@10 -- # set +x 00:07:04.395 ************************************ 00:07:04.395 END TEST accel_xor 00:07:04.395 ************************************ 00:07:04.395 05:32:15 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:04.395 05:32:15 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:04.395 05:32:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:04.395 05:32:15 -- common/autotest_common.sh@10 -- # set +x 00:07:04.395 ************************************ 00:07:04.395 START TEST 
accel_dif_verify 00:07:04.395 ************************************ 00:07:04.395 05:32:15 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:07:04.395 05:32:15 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.395 05:32:15 -- accel/accel.sh@17 -- # local accel_module 00:07:04.395 05:32:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:07:04.395 05:32:15 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.395 05:32:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:04.395 05:32:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.395 05:32:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.395 05:32:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.395 05:32:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.395 05:32:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.395 05:32:15 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.395 05:32:15 -- accel/accel.sh@42 -- # jq -r . 00:07:04.395 [2024-11-29 05:32:15.513022] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:04.395 [2024-11-29 05:32:15.513107] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2206928 ] 00:07:04.395 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.395 [2024-11-29 05:32:15.580702] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.395 [2024-11-29 05:32:15.616280] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.768 05:32:16 -- accel/accel.sh@18 -- # out=' 00:07:05.768 SPDK Configuration: 00:07:05.768 Core mask: 0x1 00:07:05.768 00:07:05.768 Accel Perf Configuration: 00:07:05.768 Workload Type: dif_verify 00:07:05.768 Vector size: 4096 bytes 00:07:05.768 Transfer size: 4096 bytes 00:07:05.768 Block size: 512 bytes 00:07:05.768 Metadata size: 8 bytes 00:07:05.768 Vector count 1 00:07:05.768 Module: software 00:07:05.768 Queue depth: 32 00:07:05.768 Allocate depth: 32 00:07:05.768 # threads/core: 1 00:07:05.768 Run time: 1 seconds 00:07:05.768 Verify: No 00:07:05.768 00:07:05.768 Running for 1 seconds... 00:07:05.768 00:07:05.768 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:05.768 ------------------------------------------------------------------------------------ 00:07:05.768 0,0 247744/s 982 MiB/s 0 0 00:07:05.768 ==================================================================================== 00:07:05.768 Total 247744/s 967 MiB/s 0 0' 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:05.768 05:32:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:05.768 05:32:16 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.768 05:32:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.768 05:32:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.768 05:32:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.768 05:32:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.768 05:32:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.768 05:32:16 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.768 05:32:16 -- accel/accel.sh@42 -- # jq -r . 
00:07:05.768 [2024-11-29 05:32:16.798151] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:05.768 [2024-11-29 05:32:16.798244] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2207195 ] 00:07:05.768 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.768 [2024-11-29 05:32:16.865970] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.768 [2024-11-29 05:32:16.899670] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val= 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val= 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val=0x1 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val= 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val= 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val=dif_verify 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val= 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val=software 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.768 05:32:16 -- accel/accel.sh@21 -- # val=32 00:07:05.768 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.768 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.769 05:32:16 -- accel/accel.sh@21 -- # val=32 00:07:05.769 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.769 05:32:16 -- accel/accel.sh@21 -- # val=1 00:07:05.769 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.769 05:32:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:05.769 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.769 05:32:16 -- accel/accel.sh@21 -- # val=No 00:07:05.769 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.769 05:32:16 -- accel/accel.sh@21 -- # val= 00:07:05.769 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.769 05:32:16 -- accel/accel.sh@21 -- # val= 00:07:05.769 05:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.769 05:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 05:32:18 -- accel/accel.sh@21 -- # val= 00:07:07.141 05:32:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 05:32:18 -- accel/accel.sh@21 -- # val= 00:07:07.141 05:32:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 05:32:18 -- accel/accel.sh@21 -- # val= 00:07:07.141 05:32:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 05:32:18 -- accel/accel.sh@21 -- # val= 00:07:07.141 05:32:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 05:32:18 -- accel/accel.sh@21 -- # val= 00:07:07.141 05:32:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 05:32:18 -- accel/accel.sh@21 -- # val= 00:07:07.141 05:32:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # IFS=: 00:07:07.141 05:32:18 -- accel/accel.sh@20 -- # read -r var val 00:07:07.141 05:32:18 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:07.141 05:32:18 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:07.141 05:32:18 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.141 00:07:07.141 real 0m2.576s 00:07:07.141 user 0m2.335s 00:07:07.141 sys 0m0.251s 00:07:07.141 05:32:18 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:07:07.141 05:32:18 -- common/autotest_common.sh@10 -- # set +x 00:07:07.141 ************************************ 00:07:07.141 END TEST accel_dif_verify 00:07:07.141 ************************************ 00:07:07.141 05:32:18 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:07.141 05:32:18 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:07.141 05:32:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:07.141 05:32:18 -- common/autotest_common.sh@10 -- # set +x 00:07:07.141 ************************************ 00:07:07.141 START TEST accel_dif_generate 00:07:07.141 ************************************ 00:07:07.141 05:32:18 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:07:07.141 05:32:18 -- accel/accel.sh@16 -- # local accel_opc 00:07:07.141 05:32:18 -- accel/accel.sh@17 -- # local accel_module 00:07:07.141 05:32:18 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:07.141 05:32:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:07.141 05:32:18 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.141 05:32:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.141 05:32:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.141 05:32:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.141 05:32:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.142 05:32:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.142 05:32:18 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.142 05:32:18 -- accel/accel.sh@42 -- # jq -r . 00:07:07.142 [2024-11-29 05:32:18.136356] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:07.142 [2024-11-29 05:32:18.136445] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2207484 ] 00:07:07.142 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.142 [2024-11-29 05:32:18.203803] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.142 [2024-11-29 05:32:18.238737] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.520 05:32:19 -- accel/accel.sh@18 -- # out=' 00:07:08.520 SPDK Configuration: 00:07:08.520 Core mask: 0x1 00:07:08.520 00:07:08.520 Accel Perf Configuration: 00:07:08.520 Workload Type: dif_generate 00:07:08.520 Vector size: 4096 bytes 00:07:08.520 Transfer size: 4096 bytes 00:07:08.520 Block size: 512 bytes 00:07:08.520 Metadata size: 8 bytes 00:07:08.520 Vector count 1 00:07:08.520 Module: software 00:07:08.520 Queue depth: 32 00:07:08.520 Allocate depth: 32 00:07:08.520 # threads/core: 1 00:07:08.520 Run time: 1 seconds 00:07:08.520 Verify: No 00:07:08.520 00:07:08.520 Running for 1 seconds... 
00:07:08.520 00:07:08.520 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:08.520 ------------------------------------------------------------------------------------ 00:07:08.520 0,0 283200/s 1123 MiB/s 0 0 00:07:08.520 ==================================================================================== 00:07:08.520 Total 283200/s 1106 MiB/s 0 0' 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:08.520 05:32:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:08.520 05:32:19 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.520 05:32:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.520 05:32:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.520 05:32:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.520 05:32:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.520 05:32:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.520 05:32:19 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.520 05:32:19 -- accel/accel.sh@42 -- # jq -r . 00:07:08.520 [2024-11-29 05:32:19.419650] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:08.520 [2024-11-29 05:32:19.419741] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2207688 ] 00:07:08.520 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.520 [2024-11-29 05:32:19.488677] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.520 [2024-11-29 05:32:19.524518] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val=0x1 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val=dif_generate 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 
00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:08.520 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.520 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.520 05:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.521 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.521 05:32:19 -- accel/accel.sh@21 -- # val=software 00:07:08.521 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.521 05:32:19 -- accel/accel.sh@23 -- # accel_module=software 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.521 05:32:19 -- accel/accel.sh@21 -- # val=32 00:07:08.521 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.521 05:32:19 -- accel/accel.sh@21 -- # val=32 00:07:08.521 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.521 05:32:19 -- accel/accel.sh@21 -- # val=1 00:07:08.521 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.521 05:32:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:08.521 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.521 05:32:19 -- accel/accel.sh@21 -- # val=No 00:07:08.521 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.521 05:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.521 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.521 05:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.521 05:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.521 05:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:09.536 05:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.536 05:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.536 05:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.536 05:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.536 05:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.536 05:32:20 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.536 05:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.536 05:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.536 05:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.536 05:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.536 05:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.536 05:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.536 05:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.536 05:32:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:09.536 05:32:20 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:09.536 05:32:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:09.536 00:07:09.536 real 0m2.577s 00:07:09.536 user 0m2.327s 00:07:09.536 sys 0m0.259s 00:07:09.536 05:32:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:09.536 05:32:20 -- common/autotest_common.sh@10 -- # set +x 00:07:09.536 ************************************ 00:07:09.536 END TEST accel_dif_generate 00:07:09.536 ************************************ 00:07:09.536 05:32:20 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:09.536 05:32:20 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:09.536 05:32:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:09.536 05:32:20 -- common/autotest_common.sh@10 -- # set +x 00:07:09.536 ************************************ 00:07:09.536 START TEST accel_dif_generate_copy 00:07:09.536 ************************************ 00:07:09.536 05:32:20 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:07:09.536 05:32:20 -- accel/accel.sh@16 -- # local accel_opc 00:07:09.536 05:32:20 -- accel/accel.sh@17 -- # local accel_module 00:07:09.536 05:32:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:09.536 05:32:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:09.536 05:32:20 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.536 05:32:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.536 05:32:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.536 05:32:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.536 05:32:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.536 05:32:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.536 05:32:20 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.536 05:32:20 -- accel/accel.sh@42 -- # jq -r . 00:07:09.536 [2024-11-29 05:32:20.755786] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:09.536 [2024-11-29 05:32:20.755881] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2207885 ] 00:07:09.536 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.536 [2024-11-29 05:32:20.824449] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.794 [2024-11-29 05:32:20.861122] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.727 05:32:22 -- accel/accel.sh@18 -- # out=' 00:07:10.727 SPDK Configuration: 00:07:10.727 Core mask: 0x1 00:07:10.727 00:07:10.727 Accel Perf Configuration: 00:07:10.727 Workload Type: dif_generate_copy 00:07:10.727 Vector size: 4096 bytes 00:07:10.727 Transfer size: 4096 bytes 00:07:10.727 Vector count 1 00:07:10.727 Module: software 00:07:10.727 Queue depth: 32 00:07:10.727 Allocate depth: 32 00:07:10.727 # threads/core: 1 00:07:10.727 Run time: 1 seconds 00:07:10.727 Verify: No 00:07:10.727 00:07:10.727 Running for 1 seconds... 00:07:10.727 00:07:10.727 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:10.727 ------------------------------------------------------------------------------------ 00:07:10.727 0,0 226080/s 896 MiB/s 0 0 00:07:10.727 ==================================================================================== 00:07:10.727 Total 226080/s 883 MiB/s 0 0' 00:07:10.727 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.727 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.727 05:32:22 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:10.727 05:32:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:10.727 05:32:22 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.727 05:32:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.727 05:32:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.727 05:32:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.728 05:32:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.728 05:32:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.728 05:32:22 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.728 05:32:22 -- accel/accel.sh@42 -- # jq -r . 00:07:10.986 [2024-11-29 05:32:22.043827] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:10.986 [2024-11-29 05:32:22.043920] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2208057 ] 00:07:10.986 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.986 [2024-11-29 05:32:22.112919] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.986 [2024-11-29 05:32:22.147832] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val= 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val= 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val=0x1 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val= 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val= 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val= 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val=software 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@23 -- # accel_module=software 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val=32 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val=32 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r 
var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val=1 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val=No 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val= 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:10.986 05:32:22 -- accel/accel.sh@21 -- # val= 00:07:10.986 05:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:10.986 05:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:12.359 05:32:23 -- accel/accel.sh@21 -- # val= 00:07:12.359 05:32:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # IFS=: 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # read -r var val 00:07:12.359 05:32:23 -- accel/accel.sh@21 -- # val= 00:07:12.359 05:32:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # IFS=: 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # read -r var val 00:07:12.359 05:32:23 -- accel/accel.sh@21 -- # val= 00:07:12.359 05:32:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # IFS=: 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # read -r var val 00:07:12.359 05:32:23 -- accel/accel.sh@21 -- # val= 00:07:12.359 05:32:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # IFS=: 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # read -r var val 00:07:12.359 05:32:23 -- accel/accel.sh@21 -- # val= 00:07:12.359 05:32:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # IFS=: 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # read -r var val 00:07:12.359 05:32:23 -- accel/accel.sh@21 -- # val= 00:07:12.359 05:32:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # IFS=: 00:07:12.359 05:32:23 -- accel/accel.sh@20 -- # read -r var val 00:07:12.359 05:32:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:12.359 05:32:23 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:12.359 05:32:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.359 00:07:12.359 real 0m2.578s 00:07:12.359 user 0m2.331s 00:07:12.359 sys 0m0.255s 00:07:12.359 05:32:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:12.359 05:32:23 -- common/autotest_common.sh@10 -- # set +x 00:07:12.359 ************************************ 00:07:12.359 END TEST accel_dif_generate_copy 00:07:12.359 ************************************ 00:07:12.359 05:32:23 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:12.359 05:32:23 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.359 05:32:23 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:12.359 05:32:23 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:07:12.359 05:32:23 -- common/autotest_common.sh@10 -- # set +x 00:07:12.359 ************************************ 00:07:12.359 START TEST accel_comp 00:07:12.359 ************************************ 00:07:12.359 05:32:23 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.359 05:32:23 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.359 05:32:23 -- accel/accel.sh@17 -- # local accel_module 00:07:12.359 05:32:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.359 05:32:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.359 05:32:23 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.359 05:32:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.359 05:32:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.359 05:32:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.359 05:32:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.359 05:32:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.359 05:32:23 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.359 05:32:23 -- accel/accel.sh@42 -- # jq -r . 00:07:12.359 [2024-11-29 05:32:23.382382] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:12.359 [2024-11-29 05:32:23.382473] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2208347 ] 00:07:12.359 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.359 [2024-11-29 05:32:23.451148] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.359 [2024-11-29 05:32:23.486516] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.732 05:32:24 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:13.732 00:07:13.732 SPDK Configuration: 00:07:13.732 Core mask: 0x1 00:07:13.732 00:07:13.732 Accel Perf Configuration: 00:07:13.732 Workload Type: compress 00:07:13.732 Transfer size: 4096 bytes 00:07:13.733 Vector count 1 00:07:13.733 Module: software 00:07:13.733 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.733 Queue depth: 32 00:07:13.733 Allocate depth: 32 00:07:13.733 # threads/core: 1 00:07:13.733 Run time: 1 seconds 00:07:13.733 Verify: No 00:07:13.733 00:07:13.733 Running for 1 seconds... 
00:07:13.733 00:07:13.733 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:13.733 ------------------------------------------------------------------------------------ 00:07:13.733 0,0 67712/s 282 MiB/s 0 0 00:07:13.733 ==================================================================================== 00:07:13.733 Total 67712/s 264 MiB/s 0 0' 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.733 05:32:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.733 05:32:24 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.733 05:32:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.733 05:32:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.733 05:32:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.733 05:32:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.733 05:32:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.733 05:32:24 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.733 05:32:24 -- accel/accel.sh@42 -- # jq -r . 00:07:13.733 [2024-11-29 05:32:24.668934] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:13.733 [2024-11-29 05:32:24.669035] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2208618 ] 00:07:13.733 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.733 [2024-11-29 05:32:24.736432] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.733 [2024-11-29 05:32:24.771068] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val=0x1 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val=compress 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 
05:32:24 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val=software 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@23 -- # accel_module=software 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val=32 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val=32 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val=1 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val=No 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.733 05:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.733 05:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.733 05:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:14.682 05:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.682 05:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.682 05:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.682 05:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.682 05:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.682 05:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # 
IFS=: 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.682 05:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.682 05:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.682 05:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.682 05:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.682 05:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.682 05:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.682 05:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.682 05:32:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:14.682 05:32:25 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:14.682 05:32:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.682 00:07:14.682 real 0m2.578s 00:07:14.682 user 0m2.324s 00:07:14.682 sys 0m0.261s 00:07:14.682 05:32:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:14.682 05:32:25 -- common/autotest_common.sh@10 -- # set +x 00:07:14.682 ************************************ 00:07:14.682 END TEST accel_comp 00:07:14.682 ************************************ 00:07:14.682 05:32:25 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:14.682 05:32:25 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:14.682 05:32:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:14.682 05:32:25 -- common/autotest_common.sh@10 -- # set +x 00:07:14.940 ************************************ 00:07:14.940 START TEST accel_decomp 00:07:14.940 ************************************ 00:07:14.940 05:32:25 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:14.940 05:32:25 -- accel/accel.sh@16 -- # local accel_opc 00:07:14.940 05:32:25 -- accel/accel.sh@17 -- # local accel_module 00:07:14.940 05:32:25 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:14.940 05:32:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:14.940 05:32:25 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.940 05:32:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:14.940 05:32:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.940 05:32:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.940 05:32:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:14.940 05:32:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:14.940 05:32:25 -- accel/accel.sh@41 -- # local IFS=, 00:07:14.940 05:32:25 -- accel/accel.sh@42 -- # jq -r . 00:07:14.940 [2024-11-29 05:32:26.010390] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:14.940 [2024-11-29 05:32:26.010479] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2208899 ] 00:07:14.940 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.940 [2024-11-29 05:32:26.078864] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.940 [2024-11-29 05:32:26.114050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.315 05:32:27 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:16.315 00:07:16.315 SPDK Configuration: 00:07:16.315 Core mask: 0x1 00:07:16.315 00:07:16.315 Accel Perf Configuration: 00:07:16.315 Workload Type: decompress 00:07:16.315 Transfer size: 4096 bytes 00:07:16.315 Vector count 1 00:07:16.315 Module: software 00:07:16.315 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.315 Queue depth: 32 00:07:16.315 Allocate depth: 32 00:07:16.315 # threads/core: 1 00:07:16.315 Run time: 1 seconds 00:07:16.315 Verify: Yes 00:07:16.315 00:07:16.315 Running for 1 seconds... 00:07:16.315 00:07:16.315 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:16.315 ------------------------------------------------------------------------------------ 00:07:16.315 0,0 92832/s 171 MiB/s 0 0 00:07:16.315 ==================================================================================== 00:07:16.315 Total 92832/s 362 MiB/s 0 0' 00:07:16.315 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.315 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.315 05:32:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:16.315 05:32:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:16.315 05:32:27 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.315 05:32:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.315 05:32:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.315 05:32:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.315 05:32:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.315 05:32:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.315 05:32:27 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.315 05:32:27 -- accel/accel.sh@42 -- # jq -r . 00:07:16.315 [2024-11-29 05:32:27.297711] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:16.315 [2024-11-29 05:32:27.297803] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2209167 ] 00:07:16.315 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.315 [2024-11-29 05:32:27.365646] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.315 [2024-11-29 05:32:27.400177] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.315 05:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.315 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.315 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.315 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.315 05:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.315 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.315 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.315 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.315 05:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val=0x1 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val=decompress 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val=software 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@23 -- # accel_module=software 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val=32 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 
05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val=32 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val=1 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val=Yes 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.316 05:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.316 05:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.316 05:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:17.692 05:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.692 05:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.692 05:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.692 05:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.692 05:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.692 05:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.692 05:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.692 05:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.692 05:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.692 05:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.692 05:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.692 05:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.692 05:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.692 05:32:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:17.692 05:32:28 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:17.692 05:32:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.692 00:07:17.692 real 0m2.583s 00:07:17.692 user 0m2.332s 00:07:17.692 sys 0m0.259s 00:07:17.692 05:32:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:17.692 05:32:28 -- common/autotest_common.sh@10 -- # set +x 00:07:17.692 ************************************ 00:07:17.692 END TEST accel_decomp 00:07:17.692 ************************************ 00:07:17.692 05:32:28 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.692 05:32:28 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:17.692 05:32:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:17.692 05:32:28 -- common/autotest_common.sh@10 -- # set +x 00:07:17.692 ************************************ 00:07:17.692 START TEST accel_decmop_full 00:07:17.692 ************************************ 00:07:17.692 05:32:28 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.692 05:32:28 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.692 05:32:28 -- accel/accel.sh@17 -- # local accel_module 00:07:17.692 05:32:28 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.692 05:32:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.692 05:32:28 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.692 05:32:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.692 05:32:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.692 05:32:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.692 05:32:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.692 05:32:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.692 05:32:28 -- accel/accel.sh@41 -- # local IFS=, 00:07:17.692 05:32:28 -- accel/accel.sh@42 -- # jq -r . 00:07:17.692 [2024-11-29 05:32:28.639899] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:17.692 [2024-11-29 05:32:28.639992] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2209381 ] 00:07:17.692 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.692 [2024-11-29 05:32:28.709311] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.692 [2024-11-29 05:32:28.744933] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.627 05:32:29 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:18.627 00:07:18.627 SPDK Configuration: 00:07:18.627 Core mask: 0x1 00:07:18.627 00:07:18.627 Accel Perf Configuration: 00:07:18.627 Workload Type: decompress 00:07:18.627 Transfer size: 111250 bytes 00:07:18.627 Vector count 1 00:07:18.627 Module: software 00:07:18.627 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:18.627 Queue depth: 32 00:07:18.627 Allocate depth: 32 00:07:18.627 # threads/core: 1 00:07:18.627 Run time: 1 seconds 00:07:18.627 Verify: Yes 00:07:18.627 00:07:18.627 Running for 1 seconds... 
00:07:18.627 00:07:18.627 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:18.627 ------------------------------------------------------------------------------------ 00:07:18.628 0,0 5952/s 245 MiB/s 0 0 00:07:18.628 ==================================================================================== 00:07:18.628 Total 5952/s 631 MiB/s 0 0' 00:07:18.628 05:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:18.628 05:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:18.628 05:32:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:18.628 05:32:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:18.628 05:32:29 -- accel/accel.sh@12 -- # build_accel_config 00:07:18.628 05:32:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:18.628 05:32:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.628 05:32:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.628 05:32:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:18.628 05:32:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:18.628 05:32:29 -- accel/accel.sh@41 -- # local IFS=, 00:07:18.628 05:32:29 -- accel/accel.sh@42 -- # jq -r . 00:07:18.886 [2024-11-29 05:32:29.936885] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:18.886 [2024-11-29 05:32:29.936975] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2209527 ] 00:07:18.886 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.886 [2024-11-29 05:32:30.005010] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.886 [2024-11-29 05:32:30.042323] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val= 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val= 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val= 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val=0x1 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val= 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val= 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val=decompress 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val= 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val=software 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@23 -- # accel_module=software 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val=32 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val=32 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val=1 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val=Yes 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val= 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:18.886 05:32:30 -- accel/accel.sh@21 -- # val= 00:07:18.886 05:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:18.886 05:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:20.263 05:32:31 -- accel/accel.sh@21 -- # val= 00:07:20.263 05:32:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # IFS=: 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # read -r var val 00:07:20.263 05:32:31 -- accel/accel.sh@21 -- # val= 00:07:20.263 05:32:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # IFS=: 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # read -r var val 00:07:20.263 05:32:31 -- accel/accel.sh@21 -- # val= 00:07:20.263 05:32:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.263 05:32:31 
-- accel/accel.sh@20 -- # IFS=: 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # read -r var val 00:07:20.263 05:32:31 -- accel/accel.sh@21 -- # val= 00:07:20.263 05:32:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # IFS=: 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # read -r var val 00:07:20.263 05:32:31 -- accel/accel.sh@21 -- # val= 00:07:20.263 05:32:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # IFS=: 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # read -r var val 00:07:20.263 05:32:31 -- accel/accel.sh@21 -- # val= 00:07:20.263 05:32:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # IFS=: 00:07:20.263 05:32:31 -- accel/accel.sh@20 -- # read -r var val 00:07:20.263 05:32:31 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:20.263 05:32:31 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:20.263 05:32:31 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.263 00:07:20.263 real 0m2.604s 00:07:20.263 user 0m2.359s 00:07:20.263 sys 0m0.253s 00:07:20.263 05:32:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:20.263 05:32:31 -- common/autotest_common.sh@10 -- # set +x 00:07:20.263 ************************************ 00:07:20.263 END TEST accel_decmop_full 00:07:20.263 ************************************ 00:07:20.263 05:32:31 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.263 05:32:31 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:20.263 05:32:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:20.263 05:32:31 -- common/autotest_common.sh@10 -- # set +x 00:07:20.263 ************************************ 00:07:20.263 START TEST accel_decomp_mcore 00:07:20.263 ************************************ 00:07:20.263 05:32:31 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.263 05:32:31 -- accel/accel.sh@16 -- # local accel_opc 00:07:20.263 05:32:31 -- accel/accel.sh@17 -- # local accel_module 00:07:20.263 05:32:31 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.263 05:32:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.263 05:32:31 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.263 05:32:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:20.263 05:32:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.263 05:32:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.263 05:32:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:20.263 05:32:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:20.263 05:32:31 -- accel/accel.sh@41 -- # local IFS=, 00:07:20.263 05:32:31 -- accel/accel.sh@42 -- # jq -r . 00:07:20.264 [2024-11-29 05:32:31.291007] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:20.264 [2024-11-29 05:32:31.291100] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2209756 ] 00:07:20.264 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.264 [2024-11-29 05:32:31.360327] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:20.264 [2024-11-29 05:32:31.398943] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:20.264 [2024-11-29 05:32:31.399061] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:20.264 [2024-11-29 05:32:31.399151] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:20.264 [2024-11-29 05:32:31.399153] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.643 05:32:32 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:21.643 00:07:21.643 SPDK Configuration: 00:07:21.644 Core mask: 0xf 00:07:21.644 00:07:21.644 Accel Perf Configuration: 00:07:21.644 Workload Type: decompress 00:07:21.644 Transfer size: 4096 bytes 00:07:21.644 Vector count 1 00:07:21.644 Module: software 00:07:21.644 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:21.644 Queue depth: 32 00:07:21.644 Allocate depth: 32 00:07:21.644 # threads/core: 1 00:07:21.644 Run time: 1 seconds 00:07:21.644 Verify: Yes 00:07:21.644 00:07:21.644 Running for 1 seconds... 00:07:21.644 00:07:21.644 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:21.644 ------------------------------------------------------------------------------------ 00:07:21.644 0,0 77920/s 143 MiB/s 0 0 00:07:21.644 3,0 78464/s 144 MiB/s 0 0 00:07:21.644 2,0 78048/s 143 MiB/s 0 0 00:07:21.644 1,0 77888/s 143 MiB/s 0 0 00:07:21.644 ==================================================================================== 00:07:21.644 Total 312320/s 1220 MiB/s 0 0' 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:21.644 05:32:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:21.644 05:32:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:21.644 05:32:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:21.644 05:32:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.644 05:32:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.644 05:32:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:21.644 05:32:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:21.644 05:32:32 -- accel/accel.sh@41 -- # local IFS=, 00:07:21.644 05:32:32 -- accel/accel.sh@42 -- # jq -r . 00:07:21.644 [2024-11-29 05:32:32.591978] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:21.644 [2024-11-29 05:32:32.592066] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210035 ] 00:07:21.644 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.644 [2024-11-29 05:32:32.659764] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:21.644 [2024-11-29 05:32:32.696758] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:21.644 [2024-11-29 05:32:32.696856] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:21.644 [2024-11-29 05:32:32.696946] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:21.644 [2024-11-29 05:32:32.696949] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val=0xf 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val=decompress 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val=software 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@23 -- # accel_module=software 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val=32 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val=32 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val=1 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val=Yes 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.644 05:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.644 05:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.644 05:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.583 05:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.583 05:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.583 05:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.583 05:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.583 05:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.583 05:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.583 05:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.583 05:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.583 
05:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.583 05:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.583 05:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.583 05:32:33 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:22.583 05:32:33 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:22.583 05:32:33 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:22.583 00:07:22.583 real 0m2.608s 00:07:22.583 user 0m9.000s 00:07:22.583 sys 0m0.269s 00:07:22.583 05:32:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:22.583 05:32:33 -- common/autotest_common.sh@10 -- # set +x 00:07:22.583 ************************************ 00:07:22.583 END TEST accel_decomp_mcore 00:07:22.583 ************************************ 00:07:22.843 05:32:33 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.843 05:32:33 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:22.843 05:32:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:22.843 05:32:33 -- common/autotest_common.sh@10 -- # set +x 00:07:22.843 ************************************ 00:07:22.843 START TEST accel_decomp_full_mcore 00:07:22.843 ************************************ 00:07:22.843 05:32:33 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.843 05:32:33 -- accel/accel.sh@16 -- # local accel_opc 00:07:22.843 05:32:33 -- accel/accel.sh@17 -- # local accel_module 00:07:22.843 05:32:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.843 05:32:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.843 05:32:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.843 05:32:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:22.843 05:32:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.843 05:32:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.843 05:32:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:22.843 05:32:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:22.843 05:32:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:22.843 05:32:33 -- accel/accel.sh@42 -- # jq -r . 00:07:22.843 [2024-11-29 05:32:33.948512] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:22.843 [2024-11-29 05:32:33.948621] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210322 ] 00:07:22.843 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.843 [2024-11-29 05:32:34.019277] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:22.843 [2024-11-29 05:32:34.058210] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.843 [2024-11-29 05:32:34.058306] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:22.843 [2024-11-29 05:32:34.058391] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:22.843 [2024-11-29 05:32:34.058393] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.223 05:32:35 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:24.223 00:07:24.223 SPDK Configuration: 00:07:24.223 Core mask: 0xf 00:07:24.223 00:07:24.223 Accel Perf Configuration: 00:07:24.223 Workload Type: decompress 00:07:24.223 Transfer size: 111250 bytes 00:07:24.223 Vector count 1 00:07:24.223 Module: software 00:07:24.223 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.223 Queue depth: 32 00:07:24.223 Allocate depth: 32 00:07:24.223 # threads/core: 1 00:07:24.223 Run time: 1 seconds 00:07:24.223 Verify: Yes 00:07:24.223 00:07:24.223 Running for 1 seconds... 00:07:24.223 00:07:24.223 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:24.223 ------------------------------------------------------------------------------------ 00:07:24.223 0,0 5792/s 239 MiB/s 0 0 00:07:24.223 3,0 5824/s 240 MiB/s 0 0 00:07:24.223 2,0 5824/s 240 MiB/s 0 0 00:07:24.223 1,0 5824/s 240 MiB/s 0 0 00:07:24.223 ==================================================================================== 00:07:24.223 Total 23264/s 2468 MiB/s 0 0' 00:07:24.223 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.223 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.223 05:32:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:24.223 05:32:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:24.223 05:32:35 -- accel/accel.sh@12 -- # build_accel_config 00:07:24.223 05:32:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:24.223 05:32:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:24.223 05:32:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:24.223 05:32:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:24.223 05:32:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:24.223 05:32:35 -- accel/accel.sh@41 -- # local IFS=, 00:07:24.223 05:32:35 -- accel/accel.sh@42 -- # jq -r . 00:07:24.223 [2024-11-29 05:32:35.258724] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:24.223 [2024-11-29 05:32:35.258834] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210591 ] 00:07:24.223 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.223 [2024-11-29 05:32:35.327290] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:24.223 [2024-11-29 05:32:35.364109] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:24.223 [2024-11-29 05:32:35.364206] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:24.223 [2024-11-29 05:32:35.364292] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:24.223 [2024-11-29 05:32:35.364293] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.223 05:32:35 -- accel/accel.sh@21 -- # val= 00:07:24.223 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.223 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.223 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.223 05:32:35 -- accel/accel.sh@21 -- # val= 00:07:24.223 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val= 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val=0xf 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val= 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val= 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val=decompress 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val= 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val=software 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@23 -- # accel_module=software 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val=32 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val=32 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val=1 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val=Yes 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val= 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:24.224 05:32:35 -- accel/accel.sh@21 -- # val= 00:07:24.224 05:32:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # IFS=: 00:07:24.224 05:32:35 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.605 05:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.605 05:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.605 05:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.605 05:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.605 05:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.605 05:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.605 05:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.605 05:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.605 
05:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.605 05:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.605 05:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.605 05:32:36 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:25.605 05:32:36 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:25.605 05:32:36 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:25.605 00:07:25.605 real 0m2.627s 00:07:25.605 user 0m9.050s 00:07:25.605 sys 0m0.284s 00:07:25.605 05:32:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:25.605 05:32:36 -- common/autotest_common.sh@10 -- # set +x 00:07:25.605 ************************************ 00:07:25.605 END TEST accel_decomp_full_mcore 00:07:25.605 ************************************ 00:07:25.605 05:32:36 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.605 05:32:36 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:25.605 05:32:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:25.605 05:32:36 -- common/autotest_common.sh@10 -- # set +x 00:07:25.605 ************************************ 00:07:25.605 START TEST accel_decomp_mthread 00:07:25.605 ************************************ 00:07:25.605 05:32:36 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.605 05:32:36 -- accel/accel.sh@16 -- # local accel_opc 00:07:25.605 05:32:36 -- accel/accel.sh@17 -- # local accel_module 00:07:25.605 05:32:36 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.605 05:32:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.605 05:32:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:25.605 05:32:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:25.605 05:32:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.605 05:32:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.605 05:32:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:25.605 05:32:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:25.605 05:32:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:25.605 05:32:36 -- accel/accel.sh@42 -- # jq -r . 00:07:25.605 [2024-11-29 05:32:36.624939] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:25.605 [2024-11-29 05:32:36.625030] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2210882 ] 00:07:25.605 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.605 [2024-11-29 05:32:36.692374] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.605 [2024-11-29 05:32:36.727379] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.985 05:32:37 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:26.985 00:07:26.985 SPDK Configuration: 00:07:26.985 Core mask: 0x1 00:07:26.985 00:07:26.985 Accel Perf Configuration: 00:07:26.985 Workload Type: decompress 00:07:26.985 Transfer size: 4096 bytes 00:07:26.985 Vector count 1 00:07:26.985 Module: software 00:07:26.985 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:26.985 Queue depth: 32 00:07:26.985 Allocate depth: 32 00:07:26.985 # threads/core: 2 00:07:26.985 Run time: 1 seconds 00:07:26.985 Verify: Yes 00:07:26.985 00:07:26.985 Running for 1 seconds... 00:07:26.985 00:07:26.985 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:26.985 ------------------------------------------------------------------------------------ 00:07:26.985 0,1 47584/s 87 MiB/s 0 0 00:07:26.985 0,0 47424/s 87 MiB/s 0 0 00:07:26.985 ==================================================================================== 00:07:26.985 Total 95008/s 371 MiB/s 0 0' 00:07:26.985 05:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.985 05:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.985 05:32:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:26.985 05:32:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:26.985 05:32:37 -- accel/accel.sh@12 -- # build_accel_config 00:07:26.986 05:32:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:26.986 05:32:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.986 05:32:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.986 05:32:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:26.986 05:32:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:26.986 05:32:37 -- accel/accel.sh@41 -- # local IFS=, 00:07:26.986 05:32:37 -- accel/accel.sh@42 -- # jq -r . 00:07:26.986 [2024-11-29 05:32:37.913541] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:26.986 [2024-11-29 05:32:37.913646] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2211062 ] 00:07:26.986 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.986 [2024-11-29 05:32:37.980616] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.986 [2024-11-29 05:32:38.014995] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val= 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val= 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val= 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val=0x1 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val= 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val= 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val=decompress 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val= 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val=software 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@23 -- # accel_module=software 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val=32 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 
05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val=32 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val=2 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val=Yes 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val= 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:26.986 05:32:38 -- accel/accel.sh@21 -- # val= 00:07:26.986 05:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:26.986 05:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:27.924 05:32:39 -- accel/accel.sh@21 -- # val= 00:07:27.924 05:32:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # IFS=: 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # read -r var val 00:07:27.924 05:32:39 -- accel/accel.sh@21 -- # val= 00:07:27.924 05:32:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # IFS=: 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # read -r var val 00:07:27.924 05:32:39 -- accel/accel.sh@21 -- # val= 00:07:27.924 05:32:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # IFS=: 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # read -r var val 00:07:27.924 05:32:39 -- accel/accel.sh@21 -- # val= 00:07:27.924 05:32:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # IFS=: 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # read -r var val 00:07:27.924 05:32:39 -- accel/accel.sh@21 -- # val= 00:07:27.924 05:32:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # IFS=: 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # read -r var val 00:07:27.924 05:32:39 -- accel/accel.sh@21 -- # val= 00:07:27.924 05:32:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # IFS=: 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # read -r var val 00:07:27.924 05:32:39 -- accel/accel.sh@21 -- # val= 00:07:27.924 05:32:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # IFS=: 00:07:27.924 05:32:39 -- accel/accel.sh@20 -- # read -r var val 00:07:27.924 05:32:39 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:27.924 05:32:39 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:27.924 05:32:39 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:27.924 00:07:27.924 real 0m2.587s 00:07:27.924 user 0m2.335s 00:07:27.924 sys 0m0.261s 00:07:27.925 05:32:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:27.925 05:32:39 -- common/autotest_common.sh@10 -- # 
set +x 00:07:27.925 ************************************ 00:07:27.925 END TEST accel_decomp_mthread 00:07:27.925 ************************************ 00:07:28.184 05:32:39 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:28.184 05:32:39 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:28.184 05:32:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:28.184 05:32:39 -- common/autotest_common.sh@10 -- # set +x 00:07:28.184 ************************************ 00:07:28.184 START TEST accel_deomp_full_mthread 00:07:28.184 ************************************ 00:07:28.184 05:32:39 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:28.184 05:32:39 -- accel/accel.sh@16 -- # local accel_opc 00:07:28.184 05:32:39 -- accel/accel.sh@17 -- # local accel_module 00:07:28.184 05:32:39 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:28.184 05:32:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:28.184 05:32:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:28.184 05:32:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:28.184 05:32:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.184 05:32:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.184 05:32:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:28.184 05:32:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:28.184 05:32:39 -- accel/accel.sh@41 -- # local IFS=, 00:07:28.184 05:32:39 -- accel/accel.sh@42 -- # jq -r . 00:07:28.184 [2024-11-29 05:32:39.259987] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:28.184 [2024-11-29 05:32:39.260081] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2211248 ] 00:07:28.184 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.184 [2024-11-29 05:32:39.328777] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.184 [2024-11-29 05:32:39.364201] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.564 05:32:40 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:29.564 00:07:29.564 SPDK Configuration: 00:07:29.564 Core mask: 0x1 00:07:29.564 00:07:29.564 Accel Perf Configuration: 00:07:29.564 Workload Type: decompress 00:07:29.564 Transfer size: 111250 bytes 00:07:29.564 Vector count 1 00:07:29.564 Module: software 00:07:29.564 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:29.564 Queue depth: 32 00:07:29.564 Allocate depth: 32 00:07:29.564 # threads/core: 2 00:07:29.564 Run time: 1 seconds 00:07:29.564 Verify: Yes 00:07:29.564 00:07:29.564 Running for 1 seconds... 
00:07:29.564 00:07:29.564 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:29.564 ------------------------------------------------------------------------------------ 00:07:29.564 0,1 2912/s 120 MiB/s 0 0 00:07:29.564 0,0 2912/s 120 MiB/s 0 0 00:07:29.564 ==================================================================================== 00:07:29.564 Total 5824/s 617 MiB/s 0 0' 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:29.564 05:32:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:29.564 05:32:40 -- accel/accel.sh@12 -- # build_accel_config 00:07:29.564 05:32:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:29.564 05:32:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.564 05:32:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.564 05:32:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:29.564 05:32:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:29.564 05:32:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:29.564 05:32:40 -- accel/accel.sh@42 -- # jq -r . 00:07:29.564 [2024-11-29 05:32:40.575078] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:29.564 [2024-11-29 05:32:40.575172] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2211454 ] 00:07:29.564 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.564 [2024-11-29 05:32:40.645390] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.564 [2024-11-29 05:32:40.680369] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val=0x1 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val=decompress 
00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val=software 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@23 -- # accel_module=software 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val=32 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val=32 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val=2 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.564 05:32:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:29.564 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.564 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.565 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.565 05:32:40 -- accel/accel.sh@21 -- # val=Yes 00:07:29.565 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.565 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.565 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.565 05:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.565 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.565 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.565 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.565 05:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.565 05:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.565 05:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.565 05:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:30.944 05:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.944 05:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.944 05:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.944 05:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.944 05:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.944 05:32:41 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.944 05:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.944 05:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.944 05:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.944 05:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.944 05:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.944 05:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.944 05:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.944 05:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.944 05:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.944 05:32:41 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:30.944 05:32:41 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:30.944 05:32:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:30.944 00:07:30.944 real 0m2.634s 00:07:30.944 user 0m2.377s 00:07:30.944 sys 0m0.262s 00:07:30.944 05:32:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:30.944 05:32:41 -- common/autotest_common.sh@10 -- # set +x 00:07:30.944 ************************************ 00:07:30.944 END TEST accel_deomp_full_mthread 00:07:30.944 ************************************ 00:07:30.944 05:32:41 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:30.944 05:32:41 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:30.944 05:32:41 -- accel/accel.sh@129 -- # build_accel_config 00:07:30.944 05:32:41 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:30.944 05:32:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:30.944 05:32:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:30.944 05:32:41 -- common/autotest_common.sh@10 -- # set +x 00:07:30.944 05:32:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.944 05:32:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.944 05:32:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:30.944 05:32:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:30.944 05:32:41 -- accel/accel.sh@41 -- # local IFS=, 00:07:30.944 05:32:41 -- accel/accel.sh@42 -- # jq -r . 00:07:30.944 ************************************ 00:07:30.944 START TEST accel_dif_functional_tests 00:07:30.944 ************************************ 00:07:30.944 05:32:41 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:30.944 [2024-11-29 05:32:41.936689] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:30.944 [2024-11-29 05:32:41.936757] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2211747 ] 00:07:30.944 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.944 [2024-11-29 05:32:41.995037] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:30.944 [2024-11-29 05:32:42.032011] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:30.944 [2024-11-29 05:32:42.032108] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.944 [2024-11-29 05:32:42.032108] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:30.944 00:07:30.944 00:07:30.944 CUnit - A unit testing framework for C - Version 2.1-3 00:07:30.944 http://cunit.sourceforge.net/ 00:07:30.944 00:07:30.944 00:07:30.944 Suite: accel_dif 00:07:30.944 Test: verify: DIF generated, GUARD check ...passed 00:07:30.944 Test: verify: DIF generated, APPTAG check ...passed 00:07:30.944 Test: verify: DIF generated, REFTAG check ...passed 00:07:30.944 Test: verify: DIF not generated, GUARD check ...[2024-11-29 05:32:42.096044] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:30.944 [2024-11-29 05:32:42.096096] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:30.944 passed 00:07:30.944 Test: verify: DIF not generated, APPTAG check ...[2024-11-29 05:32:42.096146] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:30.944 [2024-11-29 05:32:42.096165] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:30.944 passed 00:07:30.944 Test: verify: DIF not generated, REFTAG check ...[2024-11-29 05:32:42.096188] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:30.944 [2024-11-29 05:32:42.096206] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:30.944 passed 00:07:30.944 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:30.944 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-29 05:32:42.096257] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:30.944 passed 00:07:30.944 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:30.944 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:30.944 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:30.944 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-29 05:32:42.096355] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:30.944 passed 00:07:30.944 Test: generate copy: DIF generated, GUARD check ...passed 00:07:30.944 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:30.944 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:30.944 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:30.944 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:30.944 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:30.944 Test: generate copy: iovecs-len validate ...[2024-11-29 05:32:42.096545] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:30.944 passed 00:07:30.944 Test: generate copy: buffer alignment validate ...passed 00:07:30.944 00:07:30.944 Run Summary: Type Total Ran Passed Failed Inactive 00:07:30.945 suites 1 1 n/a 0 0 00:07:30.945 tests 20 20 20 0 0 00:07:30.945 asserts 204 204 204 0 n/a 00:07:30.945 00:07:30.945 Elapsed time = 0.000 seconds 00:07:31.204 00:07:31.204 real 0m0.326s 00:07:31.204 user 0m0.526s 00:07:31.204 sys 0m0.141s 00:07:31.204 05:32:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:31.204 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.204 ************************************ 00:07:31.204 END TEST accel_dif_functional_tests 00:07:31.204 ************************************ 00:07:31.204 00:07:31.204 real 0m55.326s 00:07:31.204 user 1m2.979s 00:07:31.204 sys 0m7.026s 00:07:31.204 05:32:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:31.204 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.204 ************************************ 00:07:31.204 END TEST accel 00:07:31.204 ************************************ 00:07:31.204 05:32:42 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:31.204 05:32:42 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:31.204 05:32:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:31.204 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.204 ************************************ 00:07:31.204 START TEST accel_rpc 00:07:31.204 ************************************ 00:07:31.204 05:32:42 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:31.204 * Looking for test storage... 00:07:31.204 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:31.204 05:32:42 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:31.204 05:32:42 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:31.204 05:32:42 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:31.204 05:32:42 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:31.464 05:32:42 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:31.464 05:32:42 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:31.464 05:32:42 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:31.464 05:32:42 -- scripts/common.sh@335 -- # IFS=.-: 00:07:31.464 05:32:42 -- scripts/common.sh@335 -- # read -ra ver1 00:07:31.464 05:32:42 -- scripts/common.sh@336 -- # IFS=.-: 00:07:31.464 05:32:42 -- scripts/common.sh@336 -- # read -ra ver2 00:07:31.464 05:32:42 -- scripts/common.sh@337 -- # local 'op=<' 00:07:31.464 05:32:42 -- scripts/common.sh@339 -- # ver1_l=2 00:07:31.464 05:32:42 -- scripts/common.sh@340 -- # ver2_l=1 00:07:31.464 05:32:42 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:31.464 05:32:42 -- scripts/common.sh@343 -- # case "$op" in 00:07:31.464 05:32:42 -- scripts/common.sh@344 -- # : 1 00:07:31.464 05:32:42 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:31.464 05:32:42 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:31.464 05:32:42 -- scripts/common.sh@364 -- # decimal 1 00:07:31.464 05:32:42 -- scripts/common.sh@352 -- # local d=1 00:07:31.464 05:32:42 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:31.464 05:32:42 -- scripts/common.sh@354 -- # echo 1 00:07:31.464 05:32:42 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:31.464 05:32:42 -- scripts/common.sh@365 -- # decimal 2 00:07:31.464 05:32:42 -- scripts/common.sh@352 -- # local d=2 00:07:31.464 05:32:42 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:31.464 05:32:42 -- scripts/common.sh@354 -- # echo 2 00:07:31.464 05:32:42 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:31.464 05:32:42 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:31.464 05:32:42 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:31.464 05:32:42 -- scripts/common.sh@367 -- # return 0 00:07:31.464 05:32:42 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:31.464 05:32:42 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:31.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.464 --rc genhtml_branch_coverage=1 00:07:31.464 --rc genhtml_function_coverage=1 00:07:31.464 --rc genhtml_legend=1 00:07:31.464 --rc geninfo_all_blocks=1 00:07:31.464 --rc geninfo_unexecuted_blocks=1 00:07:31.464 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.464 ' 00:07:31.464 05:32:42 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:31.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.464 --rc genhtml_branch_coverage=1 00:07:31.464 --rc genhtml_function_coverage=1 00:07:31.464 --rc genhtml_legend=1 00:07:31.464 --rc geninfo_all_blocks=1 00:07:31.464 --rc geninfo_unexecuted_blocks=1 00:07:31.464 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.464 ' 00:07:31.464 05:32:42 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:31.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.464 --rc genhtml_branch_coverage=1 00:07:31.464 --rc genhtml_function_coverage=1 00:07:31.464 --rc genhtml_legend=1 00:07:31.464 --rc geninfo_all_blocks=1 00:07:31.464 --rc geninfo_unexecuted_blocks=1 00:07:31.464 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.464 ' 00:07:31.464 05:32:42 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:31.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.464 --rc genhtml_branch_coverage=1 00:07:31.464 --rc genhtml_function_coverage=1 00:07:31.464 --rc genhtml_legend=1 00:07:31.464 --rc geninfo_all_blocks=1 00:07:31.464 --rc geninfo_unexecuted_blocks=1 00:07:31.464 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.464 ' 00:07:31.464 05:32:42 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:31.464 05:32:42 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2211967 00:07:31.464 05:32:42 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:31.465 05:32:42 -- accel/accel_rpc.sh@15 -- # waitforlisten 2211967 00:07:31.465 05:32:42 -- common/autotest_common.sh@829 -- # '[' -z 2211967 ']' 00:07:31.465 05:32:42 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.465 05:32:42 -- common/autotest_common.sh@834 -- # local max_retries=100 
00:07:31.465 05:32:42 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.465 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.465 05:32:42 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:31.465 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.465 [2024-11-29 05:32:42.550274] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:31.465 [2024-11-29 05:32:42.550368] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2211967 ] 00:07:31.465 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.465 [2024-11-29 05:32:42.618801] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.465 [2024-11-29 05:32:42.656167] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:31.465 [2024-11-29 05:32:42.656287] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.465 05:32:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:31.465 05:32:42 -- common/autotest_common.sh@862 -- # return 0 00:07:31.465 05:32:42 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:31.465 05:32:42 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:31.465 05:32:42 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:31.465 05:32:42 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:31.465 05:32:42 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:31.465 05:32:42 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:31.465 05:32:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:31.465 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.465 ************************************ 00:07:31.465 START TEST accel_assign_opcode 00:07:31.465 ************************************ 00:07:31.465 05:32:42 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:07:31.465 05:32:42 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:31.465 05:32:42 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:31.465 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.465 [2024-11-29 05:32:42.720762] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:31.465 05:32:42 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:31.465 05:32:42 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:31.465 05:32:42 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:31.465 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.465 [2024-11-29 05:32:42.728778] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:31.465 05:32:42 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:31.465 05:32:42 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:31.465 05:32:42 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:31.465 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.724 05:32:42 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:31.724 05:32:42 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:31.724 05:32:42 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:31.724 05:32:42 -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:07:31.724 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.724 05:32:42 -- accel/accel_rpc.sh@42 -- # grep software 00:07:31.724 05:32:42 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:31.724 software 00:07:31.724 00:07:31.724 real 0m0.219s 00:07:31.724 user 0m0.043s 00:07:31.724 sys 0m0.014s 00:07:31.724 05:32:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:31.724 05:32:42 -- common/autotest_common.sh@10 -- # set +x 00:07:31.724 ************************************ 00:07:31.724 END TEST accel_assign_opcode 00:07:31.724 ************************************ 00:07:31.724 05:32:42 -- accel/accel_rpc.sh@55 -- # killprocess 2211967 00:07:31.724 05:32:42 -- common/autotest_common.sh@936 -- # '[' -z 2211967 ']' 00:07:31.724 05:32:42 -- common/autotest_common.sh@940 -- # kill -0 2211967 00:07:31.724 05:32:42 -- common/autotest_common.sh@941 -- # uname 00:07:31.724 05:32:42 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:31.724 05:32:42 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2211967 00:07:31.984 05:32:43 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:31.984 05:32:43 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:31.984 05:32:43 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2211967' 00:07:31.984 killing process with pid 2211967 00:07:31.984 05:32:43 -- common/autotest_common.sh@955 -- # kill 2211967 00:07:31.984 05:32:43 -- common/autotest_common.sh@960 -- # wait 2211967 00:07:32.243 00:07:32.243 real 0m0.993s 00:07:32.243 user 0m0.878s 00:07:32.243 sys 0m0.473s 00:07:32.243 05:32:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:32.243 05:32:43 -- common/autotest_common.sh@10 -- # set +x 00:07:32.243 ************************************ 00:07:32.243 END TEST accel_rpc 00:07:32.243 ************************************ 00:07:32.243 05:32:43 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:32.243 05:32:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:32.243 05:32:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:32.243 05:32:43 -- common/autotest_common.sh@10 -- # set +x 00:07:32.243 ************************************ 00:07:32.243 START TEST app_cmdline 00:07:32.243 ************************************ 00:07:32.243 05:32:43 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:32.243 * Looking for test storage... 
00:07:32.243 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:32.243 05:32:43 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:32.243 05:32:43 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:32.243 05:32:43 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:32.504 05:32:43 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:32.504 05:32:43 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:32.504 05:32:43 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:32.504 05:32:43 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:32.504 05:32:43 -- scripts/common.sh@335 -- # IFS=.-: 00:07:32.504 05:32:43 -- scripts/common.sh@335 -- # read -ra ver1 00:07:32.504 05:32:43 -- scripts/common.sh@336 -- # IFS=.-: 00:07:32.504 05:32:43 -- scripts/common.sh@336 -- # read -ra ver2 00:07:32.504 05:32:43 -- scripts/common.sh@337 -- # local 'op=<' 00:07:32.504 05:32:43 -- scripts/common.sh@339 -- # ver1_l=2 00:07:32.504 05:32:43 -- scripts/common.sh@340 -- # ver2_l=1 00:07:32.504 05:32:43 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:32.504 05:32:43 -- scripts/common.sh@343 -- # case "$op" in 00:07:32.504 05:32:43 -- scripts/common.sh@344 -- # : 1 00:07:32.504 05:32:43 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:32.504 05:32:43 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:32.504 05:32:43 -- scripts/common.sh@364 -- # decimal 1 00:07:32.504 05:32:43 -- scripts/common.sh@352 -- # local d=1 00:07:32.504 05:32:43 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:32.504 05:32:43 -- scripts/common.sh@354 -- # echo 1 00:07:32.504 05:32:43 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:32.504 05:32:43 -- scripts/common.sh@365 -- # decimal 2 00:07:32.504 05:32:43 -- scripts/common.sh@352 -- # local d=2 00:07:32.504 05:32:43 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:32.504 05:32:43 -- scripts/common.sh@354 -- # echo 2 00:07:32.504 05:32:43 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:32.504 05:32:43 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:32.504 05:32:43 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:32.504 05:32:43 -- scripts/common.sh@367 -- # return 0 00:07:32.504 05:32:43 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:32.504 05:32:43 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:32.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.504 --rc genhtml_branch_coverage=1 00:07:32.504 --rc genhtml_function_coverage=1 00:07:32.504 --rc genhtml_legend=1 00:07:32.504 --rc geninfo_all_blocks=1 00:07:32.504 --rc geninfo_unexecuted_blocks=1 00:07:32.504 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.504 ' 00:07:32.504 05:32:43 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:32.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.504 --rc genhtml_branch_coverage=1 00:07:32.504 --rc genhtml_function_coverage=1 00:07:32.504 --rc genhtml_legend=1 00:07:32.504 --rc geninfo_all_blocks=1 00:07:32.504 --rc geninfo_unexecuted_blocks=1 00:07:32.504 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.504 ' 00:07:32.504 05:32:43 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:32.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.504 --rc genhtml_branch_coverage=1 00:07:32.504 
--rc genhtml_function_coverage=1 00:07:32.504 --rc genhtml_legend=1 00:07:32.504 --rc geninfo_all_blocks=1 00:07:32.504 --rc geninfo_unexecuted_blocks=1 00:07:32.504 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.504 ' 00:07:32.504 05:32:43 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:32.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.504 --rc genhtml_branch_coverage=1 00:07:32.504 --rc genhtml_function_coverage=1 00:07:32.504 --rc genhtml_legend=1 00:07:32.504 --rc geninfo_all_blocks=1 00:07:32.504 --rc geninfo_unexecuted_blocks=1 00:07:32.504 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.504 ' 00:07:32.504 05:32:43 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:32.504 05:32:43 -- app/cmdline.sh@17 -- # spdk_tgt_pid=2212152 00:07:32.504 05:32:43 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:32.504 05:32:43 -- app/cmdline.sh@18 -- # waitforlisten 2212152 00:07:32.504 05:32:43 -- common/autotest_common.sh@829 -- # '[' -z 2212152 ']' 00:07:32.504 05:32:43 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:32.504 05:32:43 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:32.504 05:32:43 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:32.504 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:32.504 05:32:43 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:32.504 05:32:43 -- common/autotest_common.sh@10 -- # set +x 00:07:32.504 [2024-11-29 05:32:43.599225] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:32.504 [2024-11-29 05:32:43.599297] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2212152 ] 00:07:32.504 EAL: No free 2048 kB hugepages reported on node 1 00:07:32.504 [2024-11-29 05:32:43.660712] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.504 [2024-11-29 05:32:43.698290] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:32.504 [2024-11-29 05:32:43.698413] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.442 05:32:44 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:33.442 05:32:44 -- common/autotest_common.sh@862 -- # return 0 00:07:33.442 05:32:44 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:33.442 { 00:07:33.442 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:07:33.442 "fields": { 00:07:33.442 "major": 24, 00:07:33.442 "minor": 1, 00:07:33.442 "patch": 1, 00:07:33.442 "suffix": "-pre", 00:07:33.442 "commit": "c13c99a5e" 00:07:33.442 } 00:07:33.442 } 00:07:33.442 05:32:44 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:33.442 05:32:44 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:33.442 05:32:44 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:33.442 05:32:44 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:33.442 05:32:44 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:33.442 05:32:44 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.442 05:32:44 -- app/cmdline.sh@26 -- # sort 00:07:33.442 05:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:33.442 05:32:44 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:33.442 05:32:44 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.442 05:32:44 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:33.442 05:32:44 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:33.442 05:32:44 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:33.442 05:32:44 -- common/autotest_common.sh@650 -- # local es=0 00:07:33.442 05:32:44 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:33.442 05:32:44 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:33.442 05:32:44 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:33.442 05:32:44 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:33.442 05:32:44 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:33.442 05:32:44 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:33.442 05:32:44 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:33.442 05:32:44 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:33.442 05:32:44 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:33.442 05:32:44 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:33.701 request: 00:07:33.701 { 00:07:33.701 "method": "env_dpdk_get_mem_stats", 00:07:33.701 "req_id": 1 00:07:33.701 } 00:07:33.701 Got JSON-RPC error response 00:07:33.701 response: 00:07:33.701 { 00:07:33.701 "code": -32601, 00:07:33.701 "message": "Method not found" 00:07:33.701 } 00:07:33.701 05:32:44 -- common/autotest_common.sh@653 -- # es=1 00:07:33.701 05:32:44 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:33.701 05:32:44 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:33.701 05:32:44 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:33.701 05:32:44 -- app/cmdline.sh@1 -- # killprocess 2212152 00:07:33.701 05:32:44 -- common/autotest_common.sh@936 -- # '[' -z 2212152 ']' 00:07:33.701 05:32:44 -- common/autotest_common.sh@940 -- # kill -0 2212152 00:07:33.701 05:32:44 -- common/autotest_common.sh@941 -- # uname 00:07:33.701 05:32:44 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:33.701 05:32:44 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 2212152 00:07:33.701 05:32:44 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:33.701 05:32:44 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:33.701 05:32:44 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 2212152' 00:07:33.701 killing process with pid 2212152 00:07:33.701 05:32:44 -- common/autotest_common.sh@955 -- # kill 2212152 00:07:33.701 05:32:44 -- common/autotest_common.sh@960 -- # wait 2212152 00:07:33.959 00:07:33.959 real 0m1.781s 00:07:33.959 user 0m2.049s 00:07:33.959 sys 0m0.503s 00:07:33.959 05:32:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:33.959 05:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:33.959 ************************************ 00:07:33.959 END TEST app_cmdline 00:07:33.960 ************************************ 00:07:33.960 05:32:45 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:33.960 05:32:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:33.960 05:32:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:33.960 05:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:33.960 ************************************ 00:07:33.960 START TEST version 00:07:33.960 ************************************ 00:07:33.960 05:32:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:34.218 * Looking for test storage... 
00:07:34.218 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:34.218 05:32:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:34.218 05:32:45 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:34.218 05:32:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:34.218 05:32:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:34.218 05:32:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:34.218 05:32:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:34.218 05:32:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:34.218 05:32:45 -- scripts/common.sh@335 -- # IFS=.-: 00:07:34.218 05:32:45 -- scripts/common.sh@335 -- # read -ra ver1 00:07:34.218 05:32:45 -- scripts/common.sh@336 -- # IFS=.-: 00:07:34.218 05:32:45 -- scripts/common.sh@336 -- # read -ra ver2 00:07:34.218 05:32:45 -- scripts/common.sh@337 -- # local 'op=<' 00:07:34.218 05:32:45 -- scripts/common.sh@339 -- # ver1_l=2 00:07:34.218 05:32:45 -- scripts/common.sh@340 -- # ver2_l=1 00:07:34.218 05:32:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:34.218 05:32:45 -- scripts/common.sh@343 -- # case "$op" in 00:07:34.218 05:32:45 -- scripts/common.sh@344 -- # : 1 00:07:34.218 05:32:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:34.218 05:32:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:34.218 05:32:45 -- scripts/common.sh@364 -- # decimal 1 00:07:34.218 05:32:45 -- scripts/common.sh@352 -- # local d=1 00:07:34.218 05:32:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:34.218 05:32:45 -- scripts/common.sh@354 -- # echo 1 00:07:34.218 05:32:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:34.218 05:32:45 -- scripts/common.sh@365 -- # decimal 2 00:07:34.218 05:32:45 -- scripts/common.sh@352 -- # local d=2 00:07:34.218 05:32:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:34.218 05:32:45 -- scripts/common.sh@354 -- # echo 2 00:07:34.218 05:32:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:34.218 05:32:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:34.218 05:32:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:34.218 05:32:45 -- scripts/common.sh@367 -- # return 0 00:07:34.218 05:32:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:34.218 05:32:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:34.218 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.218 --rc genhtml_branch_coverage=1 00:07:34.218 --rc genhtml_function_coverage=1 00:07:34.218 --rc genhtml_legend=1 00:07:34.218 --rc geninfo_all_blocks=1 00:07:34.218 --rc geninfo_unexecuted_blocks=1 00:07:34.218 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.218 ' 00:07:34.218 05:32:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:34.218 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.218 --rc genhtml_branch_coverage=1 00:07:34.218 --rc genhtml_function_coverage=1 00:07:34.218 --rc genhtml_legend=1 00:07:34.218 --rc geninfo_all_blocks=1 00:07:34.218 --rc geninfo_unexecuted_blocks=1 00:07:34.218 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.218 ' 00:07:34.218 05:32:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:34.218 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.218 --rc genhtml_branch_coverage=1 00:07:34.218 
--rc genhtml_function_coverage=1 00:07:34.218 --rc genhtml_legend=1 00:07:34.218 --rc geninfo_all_blocks=1 00:07:34.218 --rc geninfo_unexecuted_blocks=1 00:07:34.218 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.218 ' 00:07:34.218 05:32:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:34.218 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.218 --rc genhtml_branch_coverage=1 00:07:34.218 --rc genhtml_function_coverage=1 00:07:34.218 --rc genhtml_legend=1 00:07:34.218 --rc geninfo_all_blocks=1 00:07:34.218 --rc geninfo_unexecuted_blocks=1 00:07:34.218 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.218 ' 00:07:34.218 05:32:45 -- app/version.sh@17 -- # get_header_version major 00:07:34.218 05:32:45 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:34.218 05:32:45 -- app/version.sh@14 -- # cut -f2 00:07:34.218 05:32:45 -- app/version.sh@14 -- # tr -d '"' 00:07:34.218 05:32:45 -- app/version.sh@17 -- # major=24 00:07:34.218 05:32:45 -- app/version.sh@18 -- # get_header_version minor 00:07:34.218 05:32:45 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:34.218 05:32:45 -- app/version.sh@14 -- # cut -f2 00:07:34.218 05:32:45 -- app/version.sh@14 -- # tr -d '"' 00:07:34.218 05:32:45 -- app/version.sh@18 -- # minor=1 00:07:34.218 05:32:45 -- app/version.sh@19 -- # get_header_version patch 00:07:34.219 05:32:45 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:34.219 05:32:45 -- app/version.sh@14 -- # cut -f2 00:07:34.219 05:32:45 -- app/version.sh@14 -- # tr -d '"' 00:07:34.219 05:32:45 -- app/version.sh@19 -- # patch=1 00:07:34.219 05:32:45 -- app/version.sh@20 -- # get_header_version suffix 00:07:34.219 05:32:45 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:34.219 05:32:45 -- app/version.sh@14 -- # cut -f2 00:07:34.219 05:32:45 -- app/version.sh@14 -- # tr -d '"' 00:07:34.219 05:32:45 -- app/version.sh@20 -- # suffix=-pre 00:07:34.219 05:32:45 -- app/version.sh@22 -- # version=24.1 00:07:34.219 05:32:45 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:34.219 05:32:45 -- app/version.sh@25 -- # version=24.1.1 00:07:34.219 05:32:45 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:34.219 05:32:45 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:34.219 05:32:45 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:34.219 05:32:45 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:34.219 05:32:45 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:34.219 00:07:34.219 real 0m0.258s 00:07:34.219 user 0m0.151s 00:07:34.219 sys 0m0.160s 00:07:34.219 05:32:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:34.219 05:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:34.219 
************************************ 00:07:34.219 END TEST version 00:07:34.219 ************************************ 00:07:34.219 05:32:45 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:07:34.219 05:32:45 -- spdk/autotest.sh@191 -- # uname -s 00:07:34.219 05:32:45 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:07:34.219 05:32:45 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:34.219 05:32:45 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:34.219 05:32:45 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:07:34.219 05:32:45 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:07:34.219 05:32:45 -- spdk/autotest.sh@255 -- # timing_exit lib 00:07:34.219 05:32:45 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:34.219 05:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:34.478 05:32:45 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:34.478 05:32:45 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:07:34.478 05:32:45 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:07:34.478 05:32:45 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:07:34.478 05:32:45 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:34.478 05:32:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:34.478 05:32:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:34.478 05:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:34.478 ************************************ 00:07:34.478 START TEST llvm_fuzz 00:07:34.478 ************************************ 00:07:34.478 05:32:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:34.478 * Looking for test storage... 
00:07:34.478 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:34.478 05:32:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:34.478 05:32:45 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:34.478 05:32:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:34.478 05:32:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:34.478 05:32:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:34.478 05:32:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:34.478 05:32:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:34.478 05:32:45 -- scripts/common.sh@335 -- # IFS=.-: 00:07:34.478 05:32:45 -- scripts/common.sh@335 -- # read -ra ver1 00:07:34.478 05:32:45 -- scripts/common.sh@336 -- # IFS=.-: 00:07:34.478 05:32:45 -- scripts/common.sh@336 -- # read -ra ver2 00:07:34.478 05:32:45 -- scripts/common.sh@337 -- # local 'op=<' 00:07:34.478 05:32:45 -- scripts/common.sh@339 -- # ver1_l=2 00:07:34.478 05:32:45 -- scripts/common.sh@340 -- # ver2_l=1 00:07:34.478 05:32:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:34.478 05:32:45 -- scripts/common.sh@343 -- # case "$op" in 00:07:34.478 05:32:45 -- scripts/common.sh@344 -- # : 1 00:07:34.478 05:32:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:34.478 05:32:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:34.478 05:32:45 -- scripts/common.sh@364 -- # decimal 1 00:07:34.478 05:32:45 -- scripts/common.sh@352 -- # local d=1 00:07:34.478 05:32:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:34.478 05:32:45 -- scripts/common.sh@354 -- # echo 1 00:07:34.478 05:32:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:34.478 05:32:45 -- scripts/common.sh@365 -- # decimal 2 00:07:34.478 05:32:45 -- scripts/common.sh@352 -- # local d=2 00:07:34.478 05:32:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:34.478 05:32:45 -- scripts/common.sh@354 -- # echo 2 00:07:34.478 05:32:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:34.478 05:32:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:34.478 05:32:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:34.478 05:32:45 -- scripts/common.sh@367 -- # return 0 00:07:34.478 05:32:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:34.478 05:32:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:34.478 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.478 --rc genhtml_branch_coverage=1 00:07:34.478 --rc genhtml_function_coverage=1 00:07:34.478 --rc genhtml_legend=1 00:07:34.478 --rc geninfo_all_blocks=1 00:07:34.478 --rc geninfo_unexecuted_blocks=1 00:07:34.478 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.478 ' 00:07:34.478 05:32:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:34.478 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.478 --rc genhtml_branch_coverage=1 00:07:34.478 --rc genhtml_function_coverage=1 00:07:34.478 --rc genhtml_legend=1 00:07:34.478 --rc geninfo_all_blocks=1 00:07:34.478 --rc geninfo_unexecuted_blocks=1 00:07:34.478 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.478 ' 00:07:34.478 05:32:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:34.478 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.478 --rc genhtml_branch_coverage=1 00:07:34.478 
--rc genhtml_function_coverage=1 00:07:34.478 --rc genhtml_legend=1 00:07:34.478 --rc geninfo_all_blocks=1 00:07:34.478 --rc geninfo_unexecuted_blocks=1 00:07:34.478 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.478 ' 00:07:34.478 05:32:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:34.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.479 --rc genhtml_branch_coverage=1 00:07:34.479 --rc genhtml_function_coverage=1 00:07:34.479 --rc genhtml_legend=1 00:07:34.479 --rc geninfo_all_blocks=1 00:07:34.479 --rc geninfo_unexecuted_blocks=1 00:07:34.479 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.479 ' 00:07:34.479 05:32:45 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:34.479 05:32:45 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:34.479 05:32:45 -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:34.479 05:32:45 -- common/autotest_common.sh@548 -- # local fuzzers 00:07:34.479 05:32:45 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:34.479 05:32:45 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:34.479 05:32:45 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:34.479 05:32:45 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:34.479 05:32:45 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:34.479 05:32:45 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:34.479 05:32:45 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:34.479 05:32:45 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:34.479 05:32:45 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:34.479 05:32:45 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:34.479 05:32:45 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:34.479 05:32:45 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:34.479 05:32:45 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:34.479 05:32:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:34.479 05:32:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:34.479 05:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:34.479 ************************************ 00:07:34.479 START TEST nvmf_fuzz 00:07:34.479 ************************************ 00:07:34.479 05:32:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:34.740 * Looking for test storage... 
00:07:34.740 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:34.740 05:32:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:34.740 05:32:45 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:34.740 05:32:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:34.740 05:32:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:34.740 05:32:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:34.740 05:32:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:34.740 05:32:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:34.740 05:32:45 -- scripts/common.sh@335 -- # IFS=.-: 00:07:34.740 05:32:45 -- scripts/common.sh@335 -- # read -ra ver1 00:07:34.740 05:32:45 -- scripts/common.sh@336 -- # IFS=.-: 00:07:34.740 05:32:45 -- scripts/common.sh@336 -- # read -ra ver2 00:07:34.740 05:32:45 -- scripts/common.sh@337 -- # local 'op=<' 00:07:34.740 05:32:45 -- scripts/common.sh@339 -- # ver1_l=2 00:07:34.740 05:32:45 -- scripts/common.sh@340 -- # ver2_l=1 00:07:34.740 05:32:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:34.740 05:32:45 -- scripts/common.sh@343 -- # case "$op" in 00:07:34.740 05:32:45 -- scripts/common.sh@344 -- # : 1 00:07:34.740 05:32:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:34.740 05:32:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:34.740 05:32:45 -- scripts/common.sh@364 -- # decimal 1 00:07:34.740 05:32:45 -- scripts/common.sh@352 -- # local d=1 00:07:34.740 05:32:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:34.740 05:32:45 -- scripts/common.sh@354 -- # echo 1 00:07:34.740 05:32:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:34.740 05:32:45 -- scripts/common.sh@365 -- # decimal 2 00:07:34.740 05:32:45 -- scripts/common.sh@352 -- # local d=2 00:07:34.740 05:32:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:34.740 05:32:45 -- scripts/common.sh@354 -- # echo 2 00:07:34.740 05:32:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:34.740 05:32:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:34.740 05:32:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:34.740 05:32:45 -- scripts/common.sh@367 -- # return 0 00:07:34.740 05:32:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:34.740 05:32:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:34.740 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.740 --rc genhtml_branch_coverage=1 00:07:34.740 --rc genhtml_function_coverage=1 00:07:34.740 --rc genhtml_legend=1 00:07:34.740 --rc geninfo_all_blocks=1 00:07:34.740 --rc geninfo_unexecuted_blocks=1 00:07:34.740 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.740 ' 00:07:34.740 05:32:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:34.740 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.740 --rc genhtml_branch_coverage=1 00:07:34.740 --rc genhtml_function_coverage=1 00:07:34.740 --rc genhtml_legend=1 00:07:34.740 --rc geninfo_all_blocks=1 00:07:34.740 --rc geninfo_unexecuted_blocks=1 00:07:34.740 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.740 ' 00:07:34.740 05:32:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:34.740 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.740 --rc genhtml_branch_coverage=1 
00:07:34.740 --rc genhtml_function_coverage=1 00:07:34.740 --rc genhtml_legend=1 00:07:34.740 --rc geninfo_all_blocks=1 00:07:34.740 --rc geninfo_unexecuted_blocks=1 00:07:34.740 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.740 ' 00:07:34.740 05:32:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:34.740 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.740 --rc genhtml_branch_coverage=1 00:07:34.740 --rc genhtml_function_coverage=1 00:07:34.740 --rc genhtml_legend=1 00:07:34.740 --rc geninfo_all_blocks=1 00:07:34.740 --rc geninfo_unexecuted_blocks=1 00:07:34.740 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.740 ' 00:07:34.740 05:32:45 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:34.740 05:32:45 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:34.740 05:32:45 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:34.740 05:32:45 -- common/autotest_common.sh@34 -- # set -e 00:07:34.740 05:32:45 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:34.740 05:32:45 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:34.740 05:32:45 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:34.740 05:32:45 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:34.740 05:32:45 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:34.740 05:32:45 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:34.740 05:32:45 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:34.740 05:32:45 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:34.740 05:32:45 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:34.740 05:32:45 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:34.740 05:32:45 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:34.740 05:32:45 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:34.740 05:32:45 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:34.740 05:32:45 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:34.740 05:32:45 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:34.740 05:32:45 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:34.740 05:32:45 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:34.741 05:32:45 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:34.741 05:32:45 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:34.741 05:32:45 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:34.741 05:32:45 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:34.741 05:32:45 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:34.741 05:32:45 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:34.741 05:32:45 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:34.741 05:32:45 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:34.741 05:32:45 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:34.741 05:32:45 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:34.741 05:32:45 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:34.741 05:32:45 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:34.741 
05:32:45 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:34.741 05:32:45 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:34.741 05:32:45 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:34.741 05:32:45 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:34.741 05:32:45 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:34.741 05:32:45 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:34.741 05:32:45 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:34.741 05:32:45 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:34.741 05:32:45 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:34.741 05:32:45 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:34.741 05:32:45 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:34.741 05:32:45 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:34.741 05:32:45 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:34.741 05:32:45 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:34.741 05:32:45 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:34.741 05:32:45 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:34.741 05:32:45 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:34.741 05:32:45 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:34.741 05:32:45 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:34.741 05:32:45 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:34.741 05:32:45 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:34.741 05:32:45 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:34.741 05:32:45 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:34.741 05:32:45 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:34.741 05:32:45 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:34.741 05:32:45 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:34.741 05:32:45 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:34.741 05:32:45 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:34.741 05:32:45 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:34.741 05:32:45 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:34.741 05:32:45 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:34.741 05:32:45 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:34.741 05:32:45 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:34.741 05:32:45 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:34.741 05:32:45 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:34.741 05:32:45 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:34.741 05:32:45 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:34.741 05:32:45 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:34.741 05:32:45 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:34.741 05:32:45 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:34.741 05:32:45 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:34.741 05:32:45 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:34.741 05:32:45 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:34.741 05:32:45 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:34.741 05:32:45 -- 
common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:34.741 05:32:45 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:34.741 05:32:45 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:34.741 05:32:45 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:34.741 05:32:45 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:34.741 05:32:45 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:34.741 05:32:45 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:34.741 05:32:45 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:34.741 05:32:45 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:34.741 05:32:45 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:34.741 05:32:45 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:34.741 05:32:45 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:34.741 05:32:45 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:34.741 05:32:45 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:34.741 05:32:45 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:34.741 05:32:45 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:34.741 05:32:45 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:34.741 05:32:45 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:34.741 05:32:45 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:34.741 05:32:45 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:34.741 05:32:45 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:34.741 05:32:45 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:34.741 05:32:45 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:34.741 05:32:45 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:34.741 05:32:45 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:34.741 05:32:45 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:34.741 #define SPDK_CONFIG_H 00:07:34.741 #define SPDK_CONFIG_APPS 1 00:07:34.741 #define SPDK_CONFIG_ARCH native 00:07:34.741 #undef SPDK_CONFIG_ASAN 00:07:34.741 #undef SPDK_CONFIG_AVAHI 00:07:34.741 #undef SPDK_CONFIG_CET 00:07:34.741 #define SPDK_CONFIG_COVERAGE 1 00:07:34.741 #define SPDK_CONFIG_CROSS_PREFIX 00:07:34.741 #undef SPDK_CONFIG_CRYPTO 00:07:34.741 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:34.741 #undef SPDK_CONFIG_CUSTOMOCF 00:07:34.741 #undef SPDK_CONFIG_DAOS 00:07:34.741 #define SPDK_CONFIG_DAOS_DIR 00:07:34.741 #define SPDK_CONFIG_DEBUG 1 00:07:34.741 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:34.741 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:34.741 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:34.741 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:34.741 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:34.741 #define 
SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:34.741 #define SPDK_CONFIG_EXAMPLES 1 00:07:34.741 #undef SPDK_CONFIG_FC 00:07:34.741 #define SPDK_CONFIG_FC_PATH 00:07:34.741 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:34.741 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:34.741 #undef SPDK_CONFIG_FUSE 00:07:34.741 #define SPDK_CONFIG_FUZZER 1 00:07:34.741 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:34.741 #undef SPDK_CONFIG_GOLANG 00:07:34.741 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:34.741 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:34.741 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:34.741 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:34.741 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:34.741 #define SPDK_CONFIG_IDXD 1 00:07:34.741 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:34.741 #undef SPDK_CONFIG_IPSEC_MB 00:07:34.741 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:34.741 #define SPDK_CONFIG_ISAL 1 00:07:34.741 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:34.741 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:34.741 #define SPDK_CONFIG_LIBDIR 00:07:34.741 #undef SPDK_CONFIG_LTO 00:07:34.741 #define SPDK_CONFIG_MAX_LCORES 00:07:34.741 #define SPDK_CONFIG_NVME_CUSE 1 00:07:34.741 #undef SPDK_CONFIG_OCF 00:07:34.741 #define SPDK_CONFIG_OCF_PATH 00:07:34.741 #define SPDK_CONFIG_OPENSSL_PATH 00:07:34.741 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:34.741 #undef SPDK_CONFIG_PGO_USE 00:07:34.741 #define SPDK_CONFIG_PREFIX /usr/local 00:07:34.741 #undef SPDK_CONFIG_RAID5F 00:07:34.741 #undef SPDK_CONFIG_RBD 00:07:34.741 #define SPDK_CONFIG_RDMA 1 00:07:34.741 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:34.741 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:34.741 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:34.741 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:34.741 #undef SPDK_CONFIG_SHARED 00:07:34.741 #undef SPDK_CONFIG_SMA 00:07:34.741 #define SPDK_CONFIG_TESTS 1 00:07:34.741 #undef SPDK_CONFIG_TSAN 00:07:34.741 #define SPDK_CONFIG_UBLK 1 00:07:34.741 #define SPDK_CONFIG_UBSAN 1 00:07:34.741 #undef SPDK_CONFIG_UNIT_TESTS 00:07:34.741 #undef SPDK_CONFIG_URING 00:07:34.741 #define SPDK_CONFIG_URING_PATH 00:07:34.741 #undef SPDK_CONFIG_URING_ZNS 00:07:34.741 #undef SPDK_CONFIG_USDT 00:07:34.741 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:34.741 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:34.741 #define SPDK_CONFIG_VFIO_USER 1 00:07:34.741 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:34.741 #define SPDK_CONFIG_VHOST 1 00:07:34.741 #define SPDK_CONFIG_VIRTIO 1 00:07:34.741 #undef SPDK_CONFIG_VTUNE 00:07:34.741 #define SPDK_CONFIG_VTUNE_DIR 00:07:34.741 #define SPDK_CONFIG_WERROR 1 00:07:34.741 #define SPDK_CONFIG_WPDK_DIR 00:07:34.741 #undef SPDK_CONFIG_XNVME 00:07:34.741 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:34.741 05:32:45 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:34.741 05:32:45 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:34.742 05:32:45 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:34.742 05:32:45 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:34.742 05:32:45 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:34.742 05:32:45 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:34.742 05:32:45 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:34.742 05:32:45 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:34.742 05:32:45 -- paths/export.sh@5 -- # export PATH 00:07:34.742 05:32:45 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:34.742 05:32:45 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:34.742 05:32:45 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:34.742 05:32:45 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:34.742 05:32:45 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:34.742 05:32:45 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:34.742 05:32:45 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:34.742 05:32:45 -- pm/common@16 -- # TEST_TAG=N/A 00:07:34.742 05:32:45 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:34.742 05:32:45 -- common/autotest_common.sh@52 -- # : 1 00:07:34.742 05:32:45 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:34.742 05:32:45 -- common/autotest_common.sh@56 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:34.742 05:32:45 -- common/autotest_common.sh@58 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:34.742 05:32:45 -- common/autotest_common.sh@60 -- # : 1 00:07:34.742 05:32:45 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:07:34.742 05:32:45 -- common/autotest_common.sh@62 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:34.742 05:32:45 -- common/autotest_common.sh@64 -- # : 00:07:34.742 05:32:45 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:34.742 05:32:45 -- common/autotest_common.sh@66 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:34.742 05:32:45 -- common/autotest_common.sh@68 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:34.742 05:32:45 -- common/autotest_common.sh@70 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:34.742 05:32:45 -- common/autotest_common.sh@72 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:34.742 05:32:45 -- common/autotest_common.sh@74 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:34.742 05:32:45 -- common/autotest_common.sh@76 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:34.742 05:32:45 -- common/autotest_common.sh@78 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:34.742 05:32:45 -- common/autotest_common.sh@80 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:34.742 05:32:45 -- common/autotest_common.sh@82 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:34.742 05:32:45 -- common/autotest_common.sh@84 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:34.742 05:32:45 -- common/autotest_common.sh@86 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:34.742 05:32:45 -- common/autotest_common.sh@88 -- # : 0 00:07:34.742 05:32:45 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:34.742 05:32:46 -- common/autotest_common.sh@90 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:34.742 05:32:46 -- common/autotest_common.sh@92 -- # : 1 00:07:34.742 05:32:46 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:34.742 05:32:46 -- common/autotest_common.sh@94 -- # : 1 00:07:34.742 05:32:46 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:34.742 05:32:46 -- common/autotest_common.sh@96 -- # : rdma 00:07:34.742 05:32:46 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:34.742 05:32:46 -- common/autotest_common.sh@98 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:34.742 05:32:46 -- common/autotest_common.sh@100 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:34.742 05:32:46 -- common/autotest_common.sh@102 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:34.742 05:32:46 -- common/autotest_common.sh@104 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:34.742 05:32:46 -- common/autotest_common.sh@106 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:34.742 05:32:46 -- common/autotest_common.sh@108 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:07:34.742 05:32:46 -- common/autotest_common.sh@110 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:34.742 05:32:46 -- common/autotest_common.sh@112 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:34.742 05:32:46 -- common/autotest_common.sh@114 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:34.742 05:32:46 -- common/autotest_common.sh@116 -- # : 1 00:07:34.742 05:32:46 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:34.742 05:32:46 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:34.742 05:32:46 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:34.742 05:32:46 -- common/autotest_common.sh@120 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:34.742 05:32:46 -- common/autotest_common.sh@122 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:34.742 05:32:46 -- common/autotest_common.sh@124 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:34.742 05:32:46 -- common/autotest_common.sh@126 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:34.742 05:32:46 -- common/autotest_common.sh@128 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:34.742 05:32:46 -- common/autotest_common.sh@130 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:34.742 05:32:46 -- common/autotest_common.sh@132 -- # : v22.11.4 00:07:34.742 05:32:46 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:34.742 05:32:46 -- common/autotest_common.sh@134 -- # : true 00:07:34.742 05:32:46 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:34.742 05:32:46 -- common/autotest_common.sh@136 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:34.742 05:32:46 -- common/autotest_common.sh@138 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:34.742 05:32:46 -- common/autotest_common.sh@140 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:34.742 05:32:46 -- common/autotest_common.sh@142 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:34.742 05:32:46 -- common/autotest_common.sh@144 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:34.742 05:32:46 -- common/autotest_common.sh@146 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:34.742 05:32:46 -- common/autotest_common.sh@148 -- # : 00:07:34.742 05:32:46 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:34.742 05:32:46 -- common/autotest_common.sh@150 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:34.742 05:32:46 -- common/autotest_common.sh@152 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:34.742 05:32:46 -- common/autotest_common.sh@154 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:34.742 05:32:46 -- 
common/autotest_common.sh@156 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:34.742 05:32:46 -- common/autotest_common.sh@158 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:34.742 05:32:46 -- common/autotest_common.sh@160 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:34.742 05:32:46 -- common/autotest_common.sh@163 -- # : 00:07:34.742 05:32:46 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:34.742 05:32:46 -- common/autotest_common.sh@165 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:34.742 05:32:46 -- common/autotest_common.sh@167 -- # : 0 00:07:34.742 05:32:46 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:34.742 05:32:46 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:34.742 05:32:46 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:34.743 05:32:46 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:34.743 05:32:46 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:34.743 05:32:46 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:34.743 05:32:46 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:34.743 05:32:46 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:34.743 05:32:46 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:34.743 05:32:46 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:34.743 05:32:46 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:34.743 05:32:46 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:34.743 05:32:46 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:34.743 05:32:46 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:34.743 05:32:46 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:34.743 05:32:46 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:34.743 05:32:46 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:34.743 05:32:46 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:34.743 05:32:46 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:34.743 05:32:46 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:34.743 05:32:46 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:34.743 05:32:46 -- common/autotest_common.sh@196 -- # cat 00:07:34.743 05:32:46 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:34.743 05:32:46 -- common/autotest_common.sh@224 -- # export 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:34.743 05:32:46 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:34.743 05:32:46 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:34.743 05:32:46 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:34.743 05:32:46 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:34.743 05:32:46 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:34.743 05:32:46 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:34.743 05:32:46 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:34.743 05:32:46 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:34.743 05:32:46 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:34.743 05:32:46 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:34.743 05:32:46 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:34.743 05:32:46 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:34.743 05:32:46 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:34.743 05:32:46 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:34.743 05:32:46 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:34.743 05:32:46 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:34.743 05:32:46 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:34.743 05:32:46 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:34.743 05:32:46 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:34.743 05:32:46 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:34.743 05:32:46 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:34.743 05:32:46 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:34.743 05:32:46 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:34.743 05:32:46 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:34.743 05:32:46 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:34.743 05:32:46 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:34.743 05:32:46 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:34.743 05:32:46 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:34.743 05:32:46 -- common/autotest_common.sh@259 -- # valgrind= 00:07:34.743 05:32:46 -- common/autotest_common.sh@265 -- # uname -s 00:07:34.743 05:32:46 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:34.743 05:32:46 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:34.743 05:32:46 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:34.743 05:32:46 -- 
common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:34.743 05:32:46 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:34.743 05:32:46 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:34.743 05:32:46 -- common/autotest_common.sh@275 -- # MAKE=make 00:07:34.743 05:32:46 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:34.743 05:32:46 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:34.743 05:32:46 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:34.743 05:32:46 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:34.743 05:32:46 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:34.743 05:32:46 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:34.743 05:32:46 -- common/autotest_common.sh@319 -- # [[ -z 2212845 ]] 00:07:34.743 05:32:46 -- common/autotest_common.sh@319 -- # kill -0 2212845 00:07:35.003 05:32:46 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:35.003 05:32:46 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:35.003 05:32:46 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:35.003 05:32:46 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:35.003 05:32:46 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:35.003 05:32:46 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:35.003 05:32:46 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:35.003 05:32:46 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:35.003 05:32:46 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.VHkM9w 00:07:35.003 05:32:46 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:35.003 05:32:46 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:35.003 05:32:46 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:35.003 05:32:46 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.VHkM9w/tests/nvmf /tmp/spdk.VHkM9w 00:07:35.003 05:32:46 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:35.003 05:32:46 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:35.003 05:32:46 -- common/autotest_common.sh@328 -- # df -T 00:07:35.003 05:32:46 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:35.003 05:32:46 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:35.003 05:32:46 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:35.003 05:32:46 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:07:35.003 05:32:46 -- 
common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # avails["$mount"]=51930976256 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:07:35.003 05:32:46 -- common/autotest_common.sh@364 -- # uses["$mount"]=9799630848 00:07:35.003 05:32:46 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:07:35.003 05:32:46 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:07:35.003 05:32:46 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:07:35.003 05:32:46 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:07:35.003 05:32:46 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # avails["$mount"]=30863441920 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:07:35.003 05:32:46 -- common/autotest_common.sh@364 -- # uses["$mount"]=1863680 00:07:35.003 05:32:46 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:35.003 05:32:46 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:35.003 05:32:46 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:35.003 05:32:46 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:35.003 05:32:46 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:35.003 * Looking for test storage... 
00:07:35.003 05:32:46 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:35.003 05:32:46 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:35.003 05:32:46 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:35.003 05:32:46 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:35.003 05:32:46 -- common/autotest_common.sh@373 -- # mount=/ 00:07:35.003 05:32:46 -- common/autotest_common.sh@375 -- # target_space=51930976256 00:07:35.003 05:32:46 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:35.003 05:32:46 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:35.003 05:32:46 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:35.004 05:32:46 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:35.004 05:32:46 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:35.004 05:32:46 -- common/autotest_common.sh@382 -- # new_size=12014223360 00:07:35.004 05:32:46 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:35.004 05:32:46 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:35.004 05:32:46 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:35.004 05:32:46 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:35.004 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:35.004 05:32:46 -- common/autotest_common.sh@390 -- # return 0 00:07:35.004 05:32:46 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:35.004 05:32:46 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:07:35.004 05:32:46 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:35.004 05:32:46 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:35.004 05:32:46 -- common/autotest_common.sh@1682 -- # true 00:07:35.004 05:32:46 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:35.004 05:32:46 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:35.004 05:32:46 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:35.004 05:32:46 -- common/autotest_common.sh@27 -- # exec 00:07:35.004 05:32:46 -- common/autotest_common.sh@29 -- # exec 00:07:35.004 05:32:46 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:35.004 05:32:46 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:35.004 05:32:46 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:35.004 05:32:46 -- common/autotest_common.sh@18 -- # set -x 00:07:35.004 05:32:46 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:35.004 05:32:46 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:35.004 05:32:46 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:35.004 05:32:46 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:35.004 05:32:46 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:35.004 05:32:46 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:35.004 05:32:46 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:35.004 05:32:46 -- scripts/common.sh@335 -- # IFS=.-: 00:07:35.004 05:32:46 -- scripts/common.sh@335 -- # read -ra ver1 00:07:35.004 05:32:46 -- scripts/common.sh@336 -- # IFS=.-: 00:07:35.004 05:32:46 -- scripts/common.sh@336 -- # read -ra ver2 00:07:35.004 05:32:46 -- scripts/common.sh@337 -- # local 'op=<' 00:07:35.004 05:32:46 -- scripts/common.sh@339 -- # ver1_l=2 00:07:35.004 05:32:46 -- scripts/common.sh@340 -- # ver2_l=1 00:07:35.004 05:32:46 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:35.004 05:32:46 -- scripts/common.sh@343 -- # case "$op" in 00:07:35.004 05:32:46 -- scripts/common.sh@344 -- # : 1 00:07:35.004 05:32:46 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:35.004 05:32:46 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:35.004 05:32:46 -- scripts/common.sh@364 -- # decimal 1 00:07:35.004 05:32:46 -- scripts/common.sh@352 -- # local d=1 00:07:35.004 05:32:46 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:35.004 05:32:46 -- scripts/common.sh@354 -- # echo 1 00:07:35.004 05:32:46 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:35.004 05:32:46 -- scripts/common.sh@365 -- # decimal 2 00:07:35.004 05:32:46 -- scripts/common.sh@352 -- # local d=2 00:07:35.004 05:32:46 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:35.004 05:32:46 -- scripts/common.sh@354 -- # echo 2 00:07:35.004 05:32:46 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:35.004 05:32:46 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:35.004 05:32:46 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:35.004 05:32:46 -- scripts/common.sh@367 -- # return 0 00:07:35.004 05:32:46 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:35.004 05:32:46 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:35.004 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.004 --rc genhtml_branch_coverage=1 00:07:35.004 --rc genhtml_function_coverage=1 00:07:35.004 --rc genhtml_legend=1 00:07:35.004 --rc geninfo_all_blocks=1 00:07:35.004 --rc geninfo_unexecuted_blocks=1 00:07:35.004 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.004 ' 00:07:35.004 05:32:46 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:35.004 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.004 --rc genhtml_branch_coverage=1 00:07:35.004 --rc genhtml_function_coverage=1 00:07:35.004 --rc genhtml_legend=1 00:07:35.004 --rc geninfo_all_blocks=1 00:07:35.004 --rc geninfo_unexecuted_blocks=1 00:07:35.004 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.004 ' 00:07:35.004 05:32:46 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:35.004 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:07:35.004 --rc genhtml_branch_coverage=1 00:07:35.004 --rc genhtml_function_coverage=1 00:07:35.004 --rc genhtml_legend=1 00:07:35.004 --rc geninfo_all_blocks=1 00:07:35.004 --rc geninfo_unexecuted_blocks=1 00:07:35.004 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.004 ' 00:07:35.004 05:32:46 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:35.004 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.004 --rc genhtml_branch_coverage=1 00:07:35.004 --rc genhtml_function_coverage=1 00:07:35.004 --rc genhtml_legend=1 00:07:35.004 --rc geninfo_all_blocks=1 00:07:35.004 --rc geninfo_unexecuted_blocks=1 00:07:35.004 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.004 ' 00:07:35.004 05:32:46 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:35.004 05:32:46 -- ../common.sh@8 -- # pids=() 00:07:35.004 05:32:46 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:35.004 05:32:46 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:35.004 05:32:46 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:35.004 05:32:46 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:35.004 05:32:46 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:35.004 05:32:46 -- nvmf/run.sh@61 -- # mem_size=512 00:07:35.004 05:32:46 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:35.004 05:32:46 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:35.004 05:32:46 -- ../common.sh@69 -- # local fuzz_num=25 00:07:35.004 05:32:46 -- ../common.sh@70 -- # local time=1 00:07:35.004 05:32:46 -- ../common.sh@72 -- # (( i = 0 )) 00:07:35.004 05:32:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:35.004 05:32:46 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:35.004 05:32:46 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:35.004 05:32:46 -- nvmf/run.sh@24 -- # local timen=1 00:07:35.004 05:32:46 -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.004 05:32:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:35.004 05:32:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:35.004 05:32:46 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:35.004 05:32:46 -- nvmf/run.sh@29 -- # port=4400 00:07:35.004 05:32:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:35.004 05:32:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:35.004 05:32:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.004 05:32:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:35.004 [2024-11-29 05:32:46.238058] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 
initialization... 00:07:35.004 [2024-11-29 05:32:46.238155] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2212908 ] 00:07:35.004 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.263 [2024-11-29 05:32:46.497093] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.263 [2024-11-29 05:32:46.524371] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:35.263 [2024-11-29 05:32:46.524493] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.522 [2024-11-29 05:32:46.575795] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.522 [2024-11-29 05:32:46.592183] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:35.522 INFO: Running with entropic power schedule (0xFF, 100). 00:07:35.522 INFO: Seed: 2870153827 00:07:35.522 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:35.522 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:35.522 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:35.522 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.522 #2 INITED exec/s: 0 rss: 59Mb 00:07:35.522 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:35.522 This may also happen if the target rejected all inputs we tried so far 00:07:35.781 NEW_FUNC[1/659]: 0x451418 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:35.781 NEW_FUNC[2/659]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:35.781 #6 NEW cov: 11463 ft: 11464 corp: 2/125b lim: 320 exec/s: 0 rss: 66Mb L: 124/124 MS: 4 CopyPart-InsertRepeatedBytes-ChangeByte-InsertRepeatedBytes- 00:07:35.781 [2024-11-29 05:32:46.979011] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.781 [2024-11-29 05:32:46.979055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.781 NEW_FUNC[1/14]: 0x16b5478 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:07:35.781 NEW_FUNC[2/14]: 0x16b56b8 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:07:35.781 #7 NEW cov: 11705 ft: 12538 corp: 3/202b lim: 320 exec/s: 0 rss: 66Mb L: 77/124 MS: 1 InsertRepeatedBytes- 00:07:35.781 [2024-11-29 05:32:47.018989] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.781 [2024-11-29 05:32:47.019018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.781 #13 NEW cov: 11711 ft: 12913 corp: 4/279b lim: 320 exec/s: 0 rss: 66Mb L: 77/124 MS: 1 ChangeBinInt- 00:07:36.038 #14 NEW cov: 11796 ft: 13188 corp: 5/404b lim: 320 exec/s: 0 rss: 66Mb L: 125/125 MS: 1 CrossOver- 00:07:36.038 [2024-11-29 05:32:47.099102] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:36.038 [2024-11-29 05:32:47.099134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.038 #15 NEW cov: 11796 ft: 13336 corp: 6/513b lim: 320 exec/s: 0 rss: 66Mb L: 109/125 MS: 1 InsertRepeatedBytes- 00:07:36.038 [2024-11-29 05:32:47.139737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e6) qid:0 cid:5 nsid:e6e6e6e6 cdw10:08e6e6e6 cdw11:08080808 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe6e6e6e6e6e6e6e6 00:07:36.038 [2024-11-29 05:32:47.139765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.038 [2024-11-29 05:32:47.139897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (fd) qid:0 cid:6 nsid:fdfdfdfd cdw10:fdfdfdfd cdw11:fdfdfdfd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.038 [2024-11-29 05:32:47.139916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.038 NEW_FUNC[1/1]: 0x16dd468 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:36.038 #21 NEW cov: 11831 ft: 14038 corp: 7/721b lim: 320 exec/s: 0 rss: 66Mb L: 208/208 MS: 1 InsertRepeatedBytes- 00:07:36.038 #22 NEW cov: 11831 ft: 14080 corp: 8/846b lim: 320 exec/s: 0 rss: 66Mb L: 125/208 MS: 1 CopyPart- 00:07:36.038 [2024-11-29 05:32:47.229503] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x10000000 00:07:36.038 [2024-11-29 05:32:47.229531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.038 #23 NEW cov: 11831 ft: 14133 corp: 9/923b lim: 320 exec/s: 0 rss: 66Mb L: 77/208 MS: 1 ChangeBit- 00:07:36.038 [2024-11-29 05:32:47.269637] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.038 [2024-11-29 05:32:47.269665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.038 #29 NEW cov: 11831 ft: 14177 corp: 10/1000b lim: 320 exec/s: 0 rss: 66Mb L: 77/208 MS: 1 CrossOver- 00:07:36.038 [2024-11-29 05:32:47.319832] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.038 [2024-11-29 05:32:47.319860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.038 #30 NEW cov: 11831 ft: 14229 corp: 11/1078b lim: 320 exec/s: 0 rss: 66Mb L: 78/208 MS: 1 InsertByte- 00:07:36.295 [2024-11-29 05:32:47.360118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e6) qid:0 cid:5 nsid:e6e6e6e6 cdw10:08e6e6e6 cdw11:08080808 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe6e6e6e6e6e6e6e6 00:07:36.295 [2024-11-29 05:32:47.360146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.295 #31 NEW cov: 11831 ft: 14306 corp: 12/1236b lim: 320 exec/s: 0 rss: 67Mb L: 158/208 MS: 1 CrossOver- 00:07:36.295 [2024-11-29 05:32:47.400065] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.295 [2024-11-29 05:32:47.400093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.295 #32 NEW cov: 11831 ft: 14320 corp: 13/1313b lim: 320 exec/s: 0 rss: 68Mb L: 77/208 MS: 1 ChangeByte- 00:07:36.295 [2024-11-29 05:32:47.440201] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.295 [2024-11-29 05:32:47.440230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.295 #33 NEW cov: 11831 ft: 14340 corp: 14/1390b lim: 320 exec/s: 0 rss: 68Mb L: 77/208 MS: 1 ChangeBit- 00:07:36.296 [2024-11-29 05:32:47.480408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:e6e6e600 cdw11:e6e6e6e6 00:07:36.296 [2024-11-29 05:32:47.480435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.296 #34 NEW cov: 11833 ft: 14383 corp: 15/1578b lim: 320 exec/s: 0 rss: 68Mb L: 188/208 MS: 1 CrossOver- 00:07:36.296 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:36.296 #35 NEW cov: 11856 ft: 14471 corp: 16/1703b lim: 320 exec/s: 0 rss: 68Mb L: 125/208 MS: 1 ChangeBit- 00:07:36.296 #36 NEW cov: 11856 ft: 14499 corp: 17/1828b lim: 320 exec/s: 0 rss: 68Mb L: 125/208 MS: 1 CopyPart- 00:07:36.554 [2024-11-29 05:32:47.600709] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.554 [2024-11-29 05:32:47.600739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.554 #37 NEW cov: 11856 ft: 14514 corp: 18/1906b lim: 320 exec/s: 0 rss: 68Mb L: 78/208 MS: 1 InsertByte- 00:07:36.554 [2024-11-29 05:32:47.640823] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x10000000 00:07:36.554 [2024-11-29 05:32:47.640849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.554 #38 NEW cov: 11856 ft: 14528 corp: 19/1983b lim: 320 exec/s: 38 rss: 68Mb L: 77/208 MS: 1 ChangeByte- 00:07:36.554 [2024-11-29 05:32:47.680886] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x10000000 00:07:36.554 [2024-11-29 05:32:47.680914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.554 #39 NEW cov: 11856 ft: 14544 corp: 20/2060b lim: 320 exec/s: 39 rss: 68Mb L: 77/208 MS: 1 ShuffleBytes- 00:07:36.554 [2024-11-29 05:32:47.721037] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.554 [2024-11-29 05:32:47.721065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.554 #40 NEW cov: 11856 ft: 14555 corp: 21/2128b lim: 320 exec/s: 40 rss: 68Mb L: 68/208 MS: 1 EraseBytes- 00:07:36.554 [2024-11-29 
05:32:47.761358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e6) qid:0 cid:5 nsid:e6e6e6e6 cdw10:08e6e6e6 cdw11:08080808 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe6e6e6e6e6e6e6e6 00:07:36.554 [2024-11-29 05:32:47.761384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.554 #41 NEW cov: 11856 ft: 14562 corp: 22/2286b lim: 320 exec/s: 41 rss: 68Mb L: 158/208 MS: 1 ChangeBinInt- 00:07:36.554 [2024-11-29 05:32:47.801571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e6) qid:0 cid:5 nsid:e6e6e6e6 cdw10:e6e6e6e6 cdw11:08080808 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe6e6e6e6e6e6e6e6 00:07:36.554 [2024-11-29 05:32:47.801603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.554 #42 NEW cov: 11856 ft: 14569 corp: 23/2445b lim: 320 exec/s: 42 rss: 68Mb L: 159/208 MS: 1 InsertByte- 00:07:36.554 [2024-11-29 05:32:47.841443] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x10000000 00:07:36.554 [2024-11-29 05:32:47.841471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.813 #43 NEW cov: 11856 ft: 14579 corp: 24/2522b lim: 320 exec/s: 43 rss: 68Mb L: 77/208 MS: 1 ChangeBit- 00:07:36.813 [2024-11-29 05:32:47.882197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e6) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe6e6e6e6e6e6e6e6 00:07:36.813 [2024-11-29 05:32:47.882224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.813 [2024-11-29 05:32:47.882360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e6) qid:0 cid:6 nsid:8080808 cdw10:00080000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.813 [2024-11-29 05:32:47.882376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.813 #44 NEW cov: 11860 ft: 14608 corp: 25/2731b lim: 320 exec/s: 44 rss: 68Mb L: 209/209 MS: 1 CrossOver- 00:07:36.813 [2024-11-29 05:32:47.921712] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.813 [2024-11-29 05:32:47.921740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.813 #45 NEW cov: 11860 ft: 14634 corp: 26/2808b lim: 320 exec/s: 45 rss: 68Mb L: 77/209 MS: 1 ChangeByte- 00:07:36.813 [2024-11-29 05:32:47.971932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e6) qid:0 cid:5 nsid:e6e6e6e6 cdw10:e67ae6e6 cdw11:08080808 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe6e6e6e6e6e6e6e6 00:07:36.813 [2024-11-29 05:32:47.971959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.813 #46 NEW cov: 11860 ft: 14646 corp: 27/2967b lim: 320 exec/s: 46 rss: 68Mb L: 159/209 MS: 1 InsertByte- 00:07:36.813 [2024-11-29 05:32:48.012004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:e6e6e600 cdw11:e6e6e6e6 00:07:36.813 [2024-11-29 05:32:48.012030] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.813 #52 NEW cov: 11860 ft: 14649 corp: 28/3155b lim: 320 exec/s: 52 rss: 68Mb L: 188/209 MS: 1 ShuffleBytes- 00:07:36.813 [2024-11-29 05:32:48.051995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.813 [2024-11-29 05:32:48.052021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.813 #53 NEW cov: 11860 ft: 14652 corp: 29/3233b lim: 320 exec/s: 53 rss: 68Mb L: 78/209 MS: 1 InsertByte- 00:07:36.813 [2024-11-29 05:32:48.092432] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x3939393939393939 00:07:36.813 [2024-11-29 05:32:48.092459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.813 [2024-11-29 05:32:48.092594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (39) qid:0 cid:5 nsid:39393939 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.813 [2024-11-29 05:32:48.092613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.813 #54 NEW cov: 11860 ft: 14673 corp: 30/3402b lim: 320 exec/s: 54 rss: 68Mb L: 169/209 MS: 1 InsertRepeatedBytes- 00:07:37.073 [2024-11-29 05:32:48.132350] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.073 [2024-11-29 05:32:48.132377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.073 #55 NEW cov: 11860 ft: 14690 corp: 31/3480b lim: 320 exec/s: 55 rss: 68Mb L: 78/209 MS: 1 InsertByte- 00:07:37.073 [2024-11-29 05:32:48.172433] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.073 [2024-11-29 05:32:48.172460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.073 #56 NEW cov: 11860 ft: 14740 corp: 32/3557b lim: 320 exec/s: 56 rss: 68Mb L: 77/209 MS: 1 ShuffleBytes- 00:07:37.073 [2024-11-29 05:32:48.212555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:ecececec cdw10:ecececec cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0xecececececececec 00:07:37.073 [2024-11-29 05:32:48.212584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.073 NEW_FUNC[1/1]: 0x12de638 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2016 00:07:37.073 #61 NEW cov: 11891 ft: 14777 corp: 33/3655b lim: 320 exec/s: 61 rss: 68Mb L: 98/209 MS: 5 CopyPart-InsertByte-EraseBytes-InsertByte-InsertRepeatedBytes- 00:07:37.073 [2024-11-29 05:32:48.252682] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.073 [2024-11-29 05:32:48.252708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.073 #62 NEW cov: 11891 
ft: 14778 corp: 34/3732b lim: 320 exec/s: 62 rss: 69Mb L: 77/209 MS: 1 ChangeByte- 00:07:37.073 #63 NEW cov: 11891 ft: 14786 corp: 35/3857b lim: 320 exec/s: 63 rss: 69Mb L: 125/209 MS: 1 ShuffleBytes- 00:07:37.073 [2024-11-29 05:32:48.332947] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.073 [2024-11-29 05:32:48.332976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.073 #64 NEW cov: 11891 ft: 14796 corp: 36/3934b lim: 320 exec/s: 64 rss: 69Mb L: 77/209 MS: 1 ChangeBinInt- 00:07:37.073 [2024-11-29 05:32:48.373389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x10000000 00:07:37.073 [2024-11-29 05:32:48.373417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.073 [2024-11-29 05:32:48.373538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (59) qid:0 cid:5 nsid:59595959 cdw10:59595959 cdw11:59595959 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.073 [2024-11-29 05:32:48.373553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.332 #65 NEW cov: 11891 ft: 14800 corp: 37/4098b lim: 320 exec/s: 65 rss: 69Mb L: 164/209 MS: 1 InsertRepeatedBytes- 00:07:37.332 #66 NEW cov: 11891 ft: 14805 corp: 38/4202b lim: 320 exec/s: 66 rss: 69Mb L: 104/209 MS: 1 EraseBytes- 00:07:37.332 #67 NEW cov: 11891 ft: 14809 corp: 39/4326b lim: 320 exec/s: 67 rss: 69Mb L: 124/209 MS: 1 ShuffleBytes- 00:07:37.332 [2024-11-29 05:32:48.503495] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:9c9c9c9c SGL TRANSPORT DATA BLOCK TRANSPORT 0x9c9c9c9c9c9c9c9c 00:07:37.332 [2024-11-29 05:32:48.503525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.332 #68 NEW cov: 11891 ft: 14815 corp: 40/4435b lim: 320 exec/s: 68 rss: 69Mb L: 109/209 MS: 1 ShuffleBytes- 00:07:37.332 [2024-11-29 05:32:48.553660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x10000000 00:07:37.332 [2024-11-29 05:32:48.553692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.332 #69 NEW cov: 11891 ft: 14835 corp: 41/4512b lim: 320 exec/s: 69 rss: 69Mb L: 77/209 MS: 1 ChangeByte- 00:07:37.332 [2024-11-29 05:32:48.593931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (60) qid:0 cid:5 nsid:e6e6e6e6 cdw10:08e6e6e6 cdw11:08080808 00:07:37.332 [2024-11-29 05:32:48.593960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.332 #70 NEW cov: 11891 ft: 14847 corp: 42/4670b lim: 320 exec/s: 70 rss: 69Mb L: 158/209 MS: 1 ChangeByte- 00:07:37.332 [2024-11-29 05:32:48.633851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.332 [2024-11-29 05:32:48.633879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.592 #71 NEW cov: 11891 ft: 14878 
corp: 43/4748b lim: 320 exec/s: 35 rss: 69Mb L: 78/209 MS: 1 CrossOver- 00:07:37.592 #71 DONE cov: 11891 ft: 14878 corp: 43/4748b lim: 320 exec/s: 35 rss: 69Mb 00:07:37.592 Done 71 runs in 2 second(s) 00:07:37.592 05:32:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:37.592 05:32:48 -- ../common.sh@72 -- # (( i++ )) 00:07:37.592 05:32:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.592 05:32:48 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:37.592 05:32:48 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:37.592 05:32:48 -- nvmf/run.sh@24 -- # local timen=1 00:07:37.592 05:32:48 -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.592 05:32:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:37.592 05:32:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:37.592 05:32:48 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:37.592 05:32:48 -- nvmf/run.sh@29 -- # port=4401 00:07:37.592 05:32:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:37.592 05:32:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:37.592 05:32:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.592 05:32:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:37.592 [2024-11-29 05:32:48.819321] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:37.592 [2024-11-29 05:32:48.819384] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2213420 ] 00:07:37.592 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.852 [2024-11-29 05:32:49.065966] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.852 [2024-11-29 05:32:49.093546] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:37.852 [2024-11-29 05:32:49.093671] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.852 [2024-11-29 05:32:49.145000] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.111 [2024-11-29 05:32:49.161366] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:38.111 INFO: Running with entropic power schedule (0xFF, 100). 00:07:38.111 INFO: Seed: 1145192754 00:07:38.111 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:38.111 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:38.111 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:38.111 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.111 #2 INITED exec/s: 0 rss: 59Mb 00:07:38.111 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:07:38.111 This may also happen if the target rejected all inputs we tried so far 00:07:38.111 [2024-11-29 05:32:49.206377] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.111 [2024-11-29 05:32:49.206495] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.111 [2024-11-29 05:32:49.206607] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.111 [2024-11-29 05:32:49.206829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.111 [2024-11-29 05:32:49.206859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.111 [2024-11-29 05:32:49.206914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.111 [2024-11-29 05:32:49.206931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.111 [2024-11-29 05:32:49.206981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.111 [2024-11-29 05:32:49.206994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.369 NEW_FUNC[1/671]: 0x451d18 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:38.369 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.369 #3 NEW cov: 11626 ft: 11618 corp: 2/20b lim: 30 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:38.369 [2024-11-29 05:32:49.518609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.518664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.369 [2024-11-29 05:32:49.518803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.518827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.369 [2024-11-29 05:32:49.518968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.518991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.369 #9 NEW cov: 11771 ft: 12408 corp: 3/42b lim: 30 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:38.369 [2024-11-29 05:32:49.557752] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.369 [2024-11-29 05:32:49.557929] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.369 
[2024-11-29 05:32:49.558071] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.369 [2024-11-29 05:32:49.558416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.558445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.369 [2024-11-29 05:32:49.558563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.558579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.369 [2024-11-29 05:32:49.558701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.558719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.369 #20 NEW cov: 11777 ft: 12663 corp: 4/61b lim: 30 exec/s: 0 rss: 67Mb L: 19/22 MS: 1 ShuffleBytes- 00:07:38.369 [2024-11-29 05:32:49.598622] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:07:38.369 [2024-11-29 05:32:49.598979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.599009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.369 [2024-11-29 05:32:49.599136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.599157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.369 [2024-11-29 05:32:49.599283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.599302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.369 [2024-11-29 05:32:49.599419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.599438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.369 #21 NEW cov: 11862 ft: 13378 corp: 5/85b lim: 30 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 CopyPart- 00:07:38.369 [2024-11-29 05:32:49.647858] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.369 [2024-11-29 05:32:49.648022] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3e 00:07:38.369 [2024-11-29 05:32:49.648181] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.369 [2024-11-29 05:32:49.648549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 
cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.648578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.369 [2024-11-29 05:32:49.648696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3e3e003e cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.648725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.369 [2024-11-29 05:32:49.648840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.369 [2024-11-29 05:32:49.648859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.628 #22 NEW cov: 11862 ft: 13557 corp: 6/106b lim: 30 exec/s: 0 rss: 67Mb L: 21/24 MS: 1 CrossOver- 00:07:38.628 [2024-11-29 05:32:49.688409] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.628 [2024-11-29 05:32:49.688580] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffc1 00:07:38.628 [2024-11-29 05:32:49.688745] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.628 [2024-11-29 05:32:49.689084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.689115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.689252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3ec783c1 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.689272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.689398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:c1c102c1 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.689416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.628 #23 NEW cov: 11862 ft: 13608 corp: 7/127b lim: 30 exec/s: 0 rss: 67Mb L: 21/24 MS: 1 ChangeBinInt- 00:07:38.628 [2024-11-29 05:32:49.738585] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:38.628 [2024-11-29 05:32:49.739424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.739454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.739580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.739603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:38.628 [2024-11-29 05:32:49.739731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.739760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.739881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.739903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.628 #28 NEW cov: 11870 ft: 13803 corp: 8/153b lim: 30 exec/s: 0 rss: 67Mb L: 26/26 MS: 5 ShuffleBytes-CopyPart-InsertByte-ChangeByte-CrossOver- 00:07:38.628 [2024-11-29 05:32:49.778990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.779018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.779145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.779174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.628 #29 NEW cov: 11870 ft: 14279 corp: 9/166b lim: 30 exec/s: 0 rss: 67Mb L: 13/26 MS: 1 EraseBytes- 00:07:38.628 [2024-11-29 05:32:49.818778] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:38.628 [2024-11-29 05:32:49.819214] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1a 00:07:38.628 [2024-11-29 05:32:49.819574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.819607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.819728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.819746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.819865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.819882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.820009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.820027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.628 #30 NEW cov: 11870 ft: 14292 corp: 10/192b lim: 30 
exec/s: 0 rss: 67Mb L: 26/26 MS: 1 ChangeBinInt- 00:07:38.628 [2024-11-29 05:32:49.868579] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000293e 00:07:38.628 [2024-11-29 05:32:49.868775] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.628 [2024-11-29 05:32:49.868937] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.628 [2024-11-29 05:32:49.869309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.869340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.869474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.869493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.869621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.869638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.628 #31 NEW cov: 11870 ft: 14321 corp: 11/212b lim: 30 exec/s: 0 rss: 67Mb L: 20/26 MS: 1 InsertByte- 00:07:38.628 [2024-11-29 05:32:49.919071] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002d3e 00:07:38.628 [2024-11-29 05:32:49.919254] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.628 [2024-11-29 05:32:49.919415] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:38.628 [2024-11-29 05:32:49.919783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.919815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.919937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.919956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.628 [2024-11-29 05:32:49.920085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.628 [2024-11-29 05:32:49.920103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.886 #32 NEW cov: 11870 ft: 14349 corp: 12/232b lim: 30 exec/s: 0 rss: 68Mb L: 20/26 MS: 1 ChangeBit- 00:07:38.886 [2024-11-29 05:32:49.979322] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:38.886 [2024-11-29 05:32:49.979649] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xfb 00:07:38.886 [2024-11-29 05:32:49.980154] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.886 [2024-11-29 05:32:49.980185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.886 [2024-11-29 05:32:49.980310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.886 [2024-11-29 05:32:49.980330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.886 [2024-11-29 05:32:49.980467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.886 [2024-11-29 05:32:49.980488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.886 [2024-11-29 05:32:49.980622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.886 [2024-11-29 05:32:49.980642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.886 #33 NEW cov: 11870 ft: 14389 corp: 13/258b lim: 30 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 ChangeBinInt- 00:07:38.886 [2024-11-29 05:32:50.029499] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (201492) > buf size (4096) 00:07:38.886 [2024-11-29 05:32:50.029689] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (201492) > buf size (4096) 00:07:38.886 [2024-11-29 05:32:50.029853] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (201492) > buf size (4096) 00:07:38.886 [2024-11-29 05:32:50.030021] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (201492) > buf size (4096) 00:07:38.886 [2024-11-29 05:32:50.030391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c4c400c4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.886 [2024-11-29 05:32:50.030421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.886 [2024-11-29 05:32:50.030554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:c4c400c4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.886 [2024-11-29 05:32:50.030573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.886 [2024-11-29 05:32:50.030690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:c4c400c4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.886 [2024-11-29 05:32:50.030707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.886 [2024-11-29 05:32:50.030828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:c4c400c4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.886 [2024-11-29 05:32:50.030847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:07:38.886 #38 NEW cov: 11870 ft: 14444 corp: 14/286b lim: 30 exec/s: 0 rss: 68Mb L: 28/28 MS: 5 CrossOver-CMP-ChangeBit-EraseBytes-InsertRepeatedBytes- DE: "\377\377\377\377"- 00:07:38.886 [2024-11-29 05:32:50.079729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.886 [2024-11-29 05:32:50.079760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.886 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.887 #39 NEW cov: 11893 ft: 14861 corp: 15/296b lim: 30 exec/s: 0 rss: 68Mb L: 10/28 MS: 1 EraseBytes- 00:07:38.887 [2024-11-29 05:32:50.129107] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (76692) > buf size (4096) 00:07:38.887 [2024-11-29 05:32:50.129451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:4ae400e4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.887 [2024-11-29 05:32:50.129483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.887 #41 NEW cov: 11893 ft: 14925 corp: 16/303b lim: 30 exec/s: 0 rss: 68Mb L: 7/28 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:38.887 [2024-11-29 05:32:50.169948] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10372) > buf size (4096) 00:07:38.887 [2024-11-29 05:32:50.170736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.887 [2024-11-29 05:32:50.170767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.887 [2024-11-29 05:32:50.170894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.887 [2024-11-29 05:32:50.170913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.887 [2024-11-29 05:32:50.171042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.887 [2024-11-29 05:32:50.171060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.887 [2024-11-29 05:32:50.171185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.887 [2024-11-29 05:32:50.171204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.145 #42 NEW cov: 11893 ft: 14937 corp: 17/329b lim: 30 exec/s: 0 rss: 68Mb L: 26/28 MS: 1 ChangeBit- 00:07:39.145 [2024-11-29 05:32:50.219794] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:39.145 [2024-11-29 05:32:50.220153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.220185] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.145 #46 NEW cov: 11893 ft: 14953 corp: 18/339b lim: 30 exec/s: 46 rss: 68Mb L: 10/28 MS: 4 EraseBytes-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:39.145 [2024-11-29 05:32:50.280179] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:39.145 [2024-11-29 05:32:50.280333] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2 00:07:39.145 [2024-11-29 05:32:50.280952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.280983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.145 [2024-11-29 05:32:50.281121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.281139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.145 [2024-11-29 05:32:50.281268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.281287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.145 [2024-11-29 05:32:50.281408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.281427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.145 #47 NEW cov: 11893 ft: 14982 corp: 19/365b lim: 30 exec/s: 47 rss: 68Mb L: 26/28 MS: 1 ChangeBit- 00:07:39.145 [2024-11-29 05:32:50.330253] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002d3e 00:07:39.145 [2024-11-29 05:32:50.330422] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:39.145 [2024-11-29 05:32:50.330582] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:39.145 [2024-11-29 05:32:50.330954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3e3e0251 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.330985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.145 [2024-11-29 05:32:50.331114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.331134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.145 [2024-11-29 05:32:50.331257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.331275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.145 #48 NEW cov: 11893 ft: 15052 corp: 20/385b lim: 30 exec/s: 48 rss: 68Mb L: 20/28 MS: 1 ChangeByte- 00:07:39.145 [2024-11-29 05:32:50.390312] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (131076) > buf size (4096) 00:07:39.145 [2024-11-29 05:32:50.390693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:80000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.390724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.145 #49 NEW cov: 11893 ft: 15125 corp: 21/395b lim: 30 exec/s: 49 rss: 68Mb L: 10/28 MS: 1 ChangeBit- 00:07:39.145 [2024-11-29 05:32:50.441219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.441250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.145 [2024-11-29 05:32:50.441387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.441405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.145 [2024-11-29 05:32:50.441539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.145 [2024-11-29 05:32:50.441557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.403 #52 NEW cov: 11893 ft: 15132 corp: 22/414b lim: 30 exec/s: 52 rss: 68Mb L: 19/28 MS: 3 ChangeByte-ChangeBinInt-InsertRepeatedBytes- 00:07:39.403 [2024-11-29 05:32:50.490728] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002d3e 00:07:39.403 [2024-11-29 05:32:50.490914] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (588028) > buf size (4096) 00:07:39.403 [2024-11-29 05:32:50.491066] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:39.403 [2024-11-29 05:32:50.491486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.403 [2024-11-29 05:32:50.491517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.403 [2024-11-29 05:32:50.491643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.403 [2024-11-29 05:32:50.491661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.404 [2024-11-29 05:32:50.491788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.491806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.404 #53 NEW cov: 
11893 ft: 15172 corp: 23/434b lim: 30 exec/s: 53 rss: 68Mb L: 20/28 MS: 1 CrossOver- 00:07:39.404 [2024-11-29 05:32:50.530453] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001313 00:07:39.404 [2024-11-29 05:32:50.530628] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001313 00:07:39.404 [2024-11-29 05:32:50.530983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:91918313 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.531012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.404 [2024-11-29 05:32:50.531142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:13138313 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.531161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.404 #57 NEW cov: 11893 ft: 15190 corp: 24/447b lim: 30 exec/s: 57 rss: 68Mb L: 13/28 MS: 4 ShuffleBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:07:39.404 [2024-11-29 05:32:50.570971] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002d3e 00:07:39.404 [2024-11-29 05:32:50.571135] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3e3e 00:07:39.404 [2024-11-29 05:32:50.571296] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003e3e 00:07:39.404 [2024-11-29 05:32:50.571625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3e3e0251 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.571655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.404 [2024-11-29 05:32:50.571780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.571800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.404 [2024-11-29 05:32:50.571928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.571946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.404 #58 NEW cov: 11893 ft: 15210 corp: 25/470b lim: 30 exec/s: 58 rss: 68Mb L: 23/28 MS: 1 InsertRepeatedBytes- 00:07:39.404 [2024-11-29 05:32:50.620988] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001313 00:07:39.404 [2024-11-29 05:32:50.621372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:91918313 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.621401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.404 #59 NEW cov: 11893 ft: 15225 corp: 26/480b lim: 30 exec/s: 59 rss: 68Mb L: 10/28 MS: 1 EraseBytes- 00:07:39.404 [2024-11-29 05:32:50.661893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG 
PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.661920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.404 [2024-11-29 05:32:50.662048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.662065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.404 [2024-11-29 05:32:50.662193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.662209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.404 #60 NEW cov: 11893 ft: 15240 corp: 27/501b lim: 30 exec/s: 60 rss: 68Mb L: 21/28 MS: 1 CopyPart- 00:07:39.404 [2024-11-29 05:32:50.700769] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000293e 00:07:39.404 [2024-11-29 05:32:50.701102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3e3e023e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.404 [2024-11-29 05:32:50.701130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.662 #61 NEW cov: 11893 ft: 15290 corp: 28/511b lim: 30 exec/s: 61 rss: 68Mb L: 10/28 MS: 1 EraseBytes- 00:07:39.662 [2024-11-29 05:32:50.751472] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (411208) > buf size (4096) 00:07:39.662 [2024-11-29 05:32:50.751645] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xed13 00:07:39.662 [2024-11-29 05:32:50.751990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:91918113 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.752020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.752148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ecec00ec cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.752165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.662 #67 NEW cov: 11893 ft: 15293 corp: 29/524b lim: 30 exec/s: 67 rss: 68Mb L: 13/28 MS: 1 ChangeBinInt- 00:07:39.662 [2024-11-29 05:32:50.791819] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (201492) > buf size (4096) 00:07:39.662 [2024-11-29 05:32:50.792011] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (201492) > buf size (4096) 00:07:39.662 [2024-11-29 05:32:50.792171] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (201492) > buf size (4096) 00:07:39.662 [2024-11-29 05:32:50.792331] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (201492) > buf size (4096) 00:07:39.662 [2024-11-29 05:32:50.792702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c4c400c4 cdw11:00000000 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.792733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.792857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:c4c400c4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.792877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.793001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:c4c400c4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.793020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.793141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:c4c400c4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.793161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.662 #68 NEW cov: 11893 ft: 15299 corp: 30/552b lim: 30 exec/s: 68 rss: 68Mb L: 28/28 MS: 1 ChangeBinInt- 00:07:39.662 [2024-11-29 05:32:50.841540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.841570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.662 #69 NEW cov: 11893 ft: 15306 corp: 31/559b lim: 30 exec/s: 69 rss: 68Mb L: 7/28 MS: 1 EraseBytes- 00:07:39.662 [2024-11-29 05:32:50.881997] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:39.662 [2024-11-29 05:32:50.882465] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff1a 00:07:39.662 [2024-11-29 05:32:50.882852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.882883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.883002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.883021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.883143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.883162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.883290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.883309] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.662 #70 NEW cov: 11893 ft: 15328 corp: 32/585b lim: 30 exec/s: 70 rss: 68Mb L: 26/28 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:39.662 [2024-11-29 05:32:50.932220] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:39.662 [2024-11-29 05:32:50.932379] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2 00:07:39.662 [2024-11-29 05:32:50.932678] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (768) > len (4) 00:07:39.662 [2024-11-29 05:32:50.933000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.933030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.933158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.933174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.933306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.933326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.662 [2024-11-29 05:32:50.933457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.662 [2024-11-29 05:32:50.933474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.662 #71 NEW cov: 11899 ft: 15346 corp: 33/611b lim: 30 exec/s: 71 rss: 69Mb L: 26/28 MS: 1 ChangeBinInt- 00:07:39.921 [2024-11-29 05:32:50.981930] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001313 00:07:39.921 [2024-11-29 05:32:50.982318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:91918313 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:50.982346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.921 #72 NEW cov: 11899 ft: 15359 corp: 34/621b lim: 30 exec/s: 72 rss: 69Mb L: 10/28 MS: 1 EraseBytes- 00:07:39.921 [2024-11-29 05:32:51.022475] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:39.921 [2024-11-29 05:32:51.022935] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff1a 00:07:39.921 [2024-11-29 05:32:51.023274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.023303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.921 [2024-11-29 05:32:51.023431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000f4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.023450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.921 [2024-11-29 05:32:51.023572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.023589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.921 [2024-11-29 05:32:51.023714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.023734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.921 #73 NEW cov: 11899 ft: 15368 corp: 35/647b lim: 30 exec/s: 73 rss: 69Mb L: 26/28 MS: 1 CopyPart- 00:07:39.921 [2024-11-29 05:32:51.072343] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff73 00:07:39.921 [2024-11-29 05:32:51.072686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.072716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.921 #74 NEW cov: 11899 ft: 15380 corp: 36/657b lim: 30 exec/s: 74 rss: 69Mb L: 10/28 MS: 1 ChangeByte- 00:07:39.921 [2024-11-29 05:32:51.112572] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:39.921 [2024-11-29 05:32:51.112752] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261124) > buf size (4096) 00:07:39.921 [2024-11-29 05:32:51.113083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.113113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.921 [2024-11-29 05:32:51.113225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.113243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.921 [2024-11-29 05:32:51.152640] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:39.921 [2024-11-29 05:32:51.152802] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:39.921 [2024-11-29 05:32:51.153296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.153325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.921 [2024-11-29 05:32:51.153450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.153470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.921 [2024-11-29 05:32:51.153596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.153617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.921 #76 NEW cov: 11899 ft: 15393 corp: 37/677b lim: 30 exec/s: 76 rss: 69Mb L: 20/28 MS: 2 InsertRepeatedBytes-PersAutoDict- DE: "\377\377\377\377"- 00:07:39.921 [2024-11-29 05:32:51.192672] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100005555 00:07:39.921 [2024-11-29 05:32:51.192825] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100005555 00:07:39.921 [2024-11-29 05:32:51.192978] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100005555 00:07:39.921 [2024-11-29 05:32:51.193332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:55558155 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.193361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.921 [2024-11-29 05:32:51.193485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:55558155 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.193503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.921 [2024-11-29 05:32:51.193614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:55558155 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.921 [2024-11-29 05:32:51.193643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.922 #79 NEW cov: 11899 ft: 15409 corp: 38/697b lim: 30 exec/s: 39 rss: 69Mb L: 20/28 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:39.922 #79 DONE cov: 11899 ft: 15409 corp: 38/697b lim: 30 exec/s: 39 rss: 69Mb 00:07:39.922 ###### Recommended dictionary. ###### 00:07:39.922 "\377\377\377\377" # Uses: 2 00:07:39.922 ###### End of recommended dictionary. 
###### 00:07:39.922 Done 79 runs in 2 second(s) 00:07:40.180 05:32:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:40.180 05:32:51 -- ../common.sh@72 -- # (( i++ )) 00:07:40.180 05:32:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.180 05:32:51 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:40.180 05:32:51 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:40.180 05:32:51 -- nvmf/run.sh@24 -- # local timen=1 00:07:40.180 05:32:51 -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.180 05:32:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:40.180 05:32:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:40.180 05:32:51 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:40.180 05:32:51 -- nvmf/run.sh@29 -- # port=4402 00:07:40.180 05:32:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:40.180 05:32:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:40.180 05:32:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.180 05:32:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:40.180 [2024-11-29 05:32:51.361933] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:40.180 [2024-11-29 05:32:51.362002] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2213745 ] 00:07:40.180 EAL: No free 2048 kB hugepages reported on node 1 00:07:40.439 [2024-11-29 05:32:51.610657] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.439 [2024-11-29 05:32:51.637451] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:40.439 [2024-11-29 05:32:51.637591] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.439 [2024-11-29 05:32:51.689041] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.439 [2024-11-29 05:32:51.705416] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:40.439 INFO: Running with entropic power schedule (0xFF, 100). 00:07:40.439 INFO: Seed: 3688184132 00:07:40.698 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:40.698 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:40.698 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:40.698 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.698 #2 INITED exec/s: 0 rss: 59Mb 00:07:40.698 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:40.698 This may also happen if the target rejected all inputs we tried so far 00:07:40.698 [2024-11-29 05:32:51.772247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5858000a cdw11:58005858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.698 [2024-11-29 05:32:51.772287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.957 NEW_FUNC[1/669]: 0x454738 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:40.957 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.957 #4 NEW cov: 11573 ft: 11581 corp: 2/8b lim: 35 exec/s: 0 rss: 67Mb L: 7/7 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:40.957 [2024-11-29 05:32:52.092162] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.957 [2024-11-29 05:32:52.092343] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.957 [2024-11-29 05:32:52.092685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 [2024-11-29 05:32:52.092729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.957 [2024-11-29 05:32:52.092855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 [2024-11-29 05:32:52.092883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.957 [2024-11-29 05:32:52.093008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 [2024-11-29 05:32:52.093030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.957 NEW_FUNC[1/1]: 0x1290a68 in nvmf_transport_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:723 00:07:40.957 #6 NEW cov: 11706 ft: 12432 corp: 3/33b lim: 35 exec/s: 0 rss: 67Mb L: 25/25 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:40.957 [2024-11-29 05:32:52.132328] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.957 [2024-11-29 05:32:52.132517] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.957 [2024-11-29 05:32:52.132679] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.957 [2024-11-29 05:32:52.133006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 [2024-11-29 05:32:52.133041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.957 [2024-11-29 05:32:52.133158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 
[2024-11-29 05:32:52.133179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.957 [2024-11-29 05:32:52.133307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 [2024-11-29 05:32:52.133329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.957 [2024-11-29 05:32:52.133453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 [2024-11-29 05:32:52.133477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.957 #7 NEW cov: 11712 ft: 13125 corp: 4/64b lim: 35 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 CopyPart- 00:07:40.957 [2024-11-29 05:32:52.172829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:8888000a cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 [2024-11-29 05:32:52.172858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.957 [2024-11-29 05:32:52.172990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:88880088 cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 [2024-11-29 05:32:52.173008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.957 [2024-11-29 05:32:52.173135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:88880088 cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.957 [2024-11-29 05:32:52.173151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.957 #10 NEW cov: 11797 ft: 13475 corp: 5/91b lim: 35 exec/s: 0 rss: 67Mb L: 27/31 MS: 3 CopyPart-CrossOver-InsertRepeatedBytes- 00:07:40.958 [2024-11-29 05:32:52.212428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3500000a cdw11:58000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.958 [2024-11-29 05:32:52.212457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.958 #11 NEW cov: 11797 ft: 13647 corp: 6/98b lim: 35 exec/s: 0 rss: 67Mb L: 7/31 MS: 1 CMP- DE: "5\000\000\000"- 00:07:40.958 [2024-11-29 05:32:52.252660] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.958 [2024-11-29 05:32:52.252839] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.958 [2024-11-29 05:32:52.253008] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.958 [2024-11-29 05:32:52.253367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.958 [2024-11-29 05:32:52.253396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.958 [2024-11-29 
05:32:52.253508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.958 [2024-11-29 05:32:52.253528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.958 [2024-11-29 05:32:52.253646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.958 [2024-11-29 05:32:52.253673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.958 [2024-11-29 05:32:52.253800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.958 [2024-11-29 05:32:52.253821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.215 #12 NEW cov: 11797 ft: 13824 corp: 7/129b lim: 35 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 CopyPart- 00:07:41.215 [2024-11-29 05:32:52.302741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0200000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.215 [2024-11-29 05:32:52.302770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.215 #14 NEW cov: 11797 ft: 13945 corp: 8/141b lim: 35 exec/s: 0 rss: 67Mb L: 12/31 MS: 2 CMP-InsertRepeatedBytes- DE: "\002\000\000\000"- 00:07:41.216 [2024-11-29 05:32:52.342772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5858000a cdw11:0000580a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.216 [2024-11-29 05:32:52.342800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.216 #15 NEW cov: 11797 ft: 13976 corp: 9/153b lim: 35 exec/s: 0 rss: 67Mb L: 12/31 MS: 1 CrossOver- 00:07:41.216 [2024-11-29 05:32:52.383380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:8888000a cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.216 [2024-11-29 05:32:52.383408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.216 [2024-11-29 05:32:52.383526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:88880088 cdw11:81008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.216 [2024-11-29 05:32:52.383545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.216 [2024-11-29 05:32:52.383674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:77880077 cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.216 [2024-11-29 05:32:52.383692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.216 #16 NEW cov: 11797 ft: 14054 corp: 10/180b lim: 35 exec/s: 0 rss: 67Mb L: 27/31 MS: 1 ChangeBinInt- 00:07:41.216 [2024-11-29 05:32:52.423634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:4 nsid:0 cdw10:8888000a cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.216 [2024-11-29 05:32:52.423663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.216 [2024-11-29 05:32:52.423788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:88880088 cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.216 [2024-11-29 05:32:52.423806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.216 [2024-11-29 05:32:52.423923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:88880088 cdw11:88008888 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.216 [2024-11-29 05:32:52.423940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.216 #17 NEW cov: 11797 ft: 14095 corp: 11/207b lim: 35 exec/s: 0 rss: 67Mb L: 27/31 MS: 1 ChangeBit- 00:07:41.216 [2024-11-29 05:32:52.463245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:35003500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.216 [2024-11-29 05:32:52.463272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.216 #26 NEW cov: 11797 ft: 14118 corp: 12/214b lim: 35 exec/s: 0 rss: 67Mb L: 7/31 MS: 4 PersAutoDict-EraseBytes-ShuffleBytes-CopyPart- DE: "5\000\000\000"- 00:07:41.216 [2024-11-29 05:32:52.493328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0035000a cdw11:35000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.216 [2024-11-29 05:32:52.493354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.474 #27 NEW cov: 11797 ft: 14170 corp: 13/221b lim: 35 exec/s: 0 rss: 67Mb L: 7/31 MS: 1 ShuffleBytes- 00:07:41.474 [2024-11-29 05:32:52.533552] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.474 [2024-11-29 05:32:52.533744] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.474 [2024-11-29 05:32:52.533899] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.474 [2024-11-29 05:32:52.534231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.534259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.474 [2024-11-29 05:32:52.534376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.534399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.474 [2024-11-29 05:32:52.534509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.534527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.474 [2024-11-29 05:32:52.534636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.534657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.474 #28 NEW cov: 11797 ft: 14208 corp: 14/252b lim: 35 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 ChangeBinInt- 00:07:41.474 [2024-11-29 05:32:52.573812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.573840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.474 [2024-11-29 05:32:52.573954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42000042 cdw11:00000035 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.573972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.474 #29 NEW cov: 11797 ft: 14406 corp: 15/267b lim: 35 exec/s: 0 rss: 67Mb L: 15/31 MS: 1 InsertRepeatedBytes- 00:07:41.474 [2024-11-29 05:32:52.613709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:b5003500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.613735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.474 #30 NEW cov: 11797 ft: 14419 corp: 16/274b lim: 35 exec/s: 0 rss: 67Mb L: 7/31 MS: 1 ChangeBit- 00:07:41.474 [2024-11-29 05:32:52.643897] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.474 [2024-11-29 05:32:52.644066] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.474 [2024-11-29 05:32:52.644216] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.474 [2024-11-29 05:32:52.644556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.644585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.474 [2024-11-29 05:32:52.644700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.644718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.474 [2024-11-29 05:32:52.644828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.644851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.474 [2024-11-29 05:32:52.644960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 
cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.644981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.474 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.474 #31 NEW cov: 11820 ft: 14520 corp: 17/308b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 CopyPart- 00:07:41.474 [2024-11-29 05:32:52.693981] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.474 [2024-11-29 05:32:52.694153] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.474 [2024-11-29 05:32:52.694498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.694526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.474 [2024-11-29 05:32:52.694644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.694668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.474 [2024-11-29 05:32:52.694793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.474 [2024-11-29 05:32:52.694819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.474 #32 NEW cov: 11820 ft: 14558 corp: 18/330b lim: 35 exec/s: 0 rss: 68Mb L: 22/34 MS: 1 EraseBytes- 00:07:41.474 [2024-11-29 05:32:52.734016] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.475 [2024-11-29 05:32:52.734382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3500000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.475 [2024-11-29 05:32:52.734411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.475 [2024-11-29 05:32:52.734542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:58000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.475 [2024-11-29 05:32:52.734566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.475 #33 NEW cov: 11820 ft: 14616 corp: 19/344b lim: 35 exec/s: 33 rss: 68Mb L: 14/34 MS: 1 InsertRepeatedBytes- 00:07:41.475 [2024-11-29 05:32:52.774211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5858000a cdw11:5800580a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.475 [2024-11-29 05:32:52.774241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.733 #34 NEW cov: 11820 ft: 14693 corp: 20/355b lim: 35 exec/s: 34 rss: 68Mb L: 11/34 MS: 1 CopyPart- 00:07:41.733 [2024-11-29 05:32:52.814381] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: 
Identify Namespace for invalid NSID 0 00:07:41.733 [2024-11-29 05:32:52.814565] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.733 [2024-11-29 05:32:52.814727] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.733 [2024-11-29 05:32:52.815100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.815130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.733 [2024-11-29 05:32:52.815257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.815282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.733 [2024-11-29 05:32:52.815407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.815431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.733 [2024-11-29 05:32:52.815564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.815587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.733 #35 NEW cov: 11820 ft: 14702 corp: 21/386b lim: 35 exec/s: 35 rss: 68Mb L: 31/34 MS: 1 ChangeBinInt- 00:07:41.733 [2024-11-29 05:32:52.854432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4000000a cdw11:35003500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.854459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.733 #36 NEW cov: 11820 ft: 14719 corp: 22/393b lim: 35 exec/s: 36 rss: 68Mb L: 7/34 MS: 1 ChangeBit- 00:07:41.733 [2024-11-29 05:32:52.894537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5858000a cdw11:58005858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.894567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.733 #37 NEW cov: 11820 ft: 14726 corp: 23/400b lim: 35 exec/s: 37 rss: 68Mb L: 7/34 MS: 1 ShuffleBytes- 00:07:41.733 [2024-11-29 05:32:52.934745] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.733 [2024-11-29 05:32:52.934935] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.733 [2024-11-29 05:32:52.935081] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.733 [2024-11-29 05:32:52.935431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000004 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.935462] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.733 [2024-11-29 05:32:52.935583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.935604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.733 [2024-11-29 05:32:52.935728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.935753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.733 [2024-11-29 05:32:52.935882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.935905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.733 #38 NEW cov: 11820 ft: 14739 corp: 24/431b lim: 35 exec/s: 38 rss: 68Mb L: 31/34 MS: 1 ChangeBit- 00:07:41.733 [2024-11-29 05:32:52.984863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:96003500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.733 [2024-11-29 05:32:52.984891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.733 #39 NEW cov: 11820 ft: 14745 corp: 25/439b lim: 35 exec/s: 39 rss: 68Mb L: 8/34 MS: 1 InsertByte- 00:07:41.991 [2024-11-29 05:32:53.035222] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.991 [2024-11-29 05:32:53.035391] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.991 [2024-11-29 05:32:53.035534] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.992 [2024-11-29 05:32:53.035700] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.992 [2024-11-29 05:32:53.036063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.036096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.036226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.036252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.036380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.036404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.036524] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.036547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.036673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:35003500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.036697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.992 #40 NEW cov: 11820 ft: 14884 corp: 26/474b lim: 35 exec/s: 40 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:41.992 [2024-11-29 05:32:53.085184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4000000a cdw11:75003500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.085214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.992 #41 NEW cov: 11820 ft: 14891 corp: 27/481b lim: 35 exec/s: 41 rss: 68Mb L: 7/35 MS: 1 ChangeBit- 00:07:41.992 [2024-11-29 05:32:53.125354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:b500c600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.125387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.992 #42 NEW cov: 11820 ft: 14903 corp: 28/488b lim: 35 exec/s: 42 rss: 68Mb L: 7/35 MS: 1 ChangeBinInt- 00:07:41.992 [2024-11-29 05:32:53.165146] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.992 [2024-11-29 05:32:53.165487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:75000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.165522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.992 #43 NEW cov: 11820 ft: 14934 corp: 29/495b lim: 35 exec/s: 43 rss: 68Mb L: 7/35 MS: 1 ChangeBinInt- 00:07:41.992 [2024-11-29 05:32:53.205858] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.992 [2024-11-29 05:32:53.206019] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.992 [2024-11-29 05:32:53.206174] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.992 [2024-11-29 05:32:53.206541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000004 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.206570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.206680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:88880088 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.206699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.992 
[2024-11-29 05:32:53.206823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.206848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.206962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:1f000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.206984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.207100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.207124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.992 #44 NEW cov: 11820 ft: 14941 corp: 30/530b lim: 35 exec/s: 44 rss: 68Mb L: 35/35 MS: 1 CrossOver- 00:07:41.992 [2024-11-29 05:32:53.255849] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.992 [2024-11-29 05:32:53.256024] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.992 [2024-11-29 05:32:53.256186] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:41.992 [2024-11-29 05:32:53.256549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.256579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.256698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.256722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.256851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.256874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.992 [2024-11-29 05:32:53.257000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.992 [2024-11-29 05:32:53.257021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.992 #45 NEW cov: 11820 ft: 15033 corp: 31/564b lim: 35 exec/s: 45 rss: 68Mb L: 34/35 MS: 1 ChangeBit- 00:07:42.250 [2024-11-29 05:32:53.306038] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.250 [2024-11-29 05:32:53.306205] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.250 [2024-11-29 
05:32:53.306578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.306614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.250 [2024-11-29 05:32:53.306735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.306755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.250 [2024-11-29 05:32:53.306875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.306896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.250 #46 NEW cov: 11820 ft: 15042 corp: 32/590b lim: 35 exec/s: 46 rss: 68Mb L: 26/35 MS: 1 CrossOver- 00:07:42.250 [2024-11-29 05:32:53.346155] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.250 [2024-11-29 05:32:53.346324] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.250 [2024-11-29 05:32:53.346482] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.250 [2024-11-29 05:32:53.346846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.346877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.250 [2024-11-29 05:32:53.347010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.347035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.250 [2024-11-29 05:32:53.347158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.347183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.250 [2024-11-29 05:32:53.347311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.347339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.250 #47 NEW cov: 11820 ft: 15043 corp: 33/624b lim: 35 exec/s: 47 rss: 68Mb L: 34/35 MS: 1 CopyPart- 00:07:42.250 [2024-11-29 05:32:53.386253] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.250 [2024-11-29 05:32:53.386429] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.250 [2024-11-29 05:32:53.386584] 
ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.250 [2024-11-29 05:32:53.386942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:09000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.386970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.250 [2024-11-29 05:32:53.387093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.387115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.250 [2024-11-29 05:32:53.387234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.250 [2024-11-29 05:32:53.387259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.251 [2024-11-29 05:32:53.387389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.387410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.251 #48 NEW cov: 11820 ft: 15070 corp: 34/655b lim: 35 exec/s: 48 rss: 68Mb L: 31/35 MS: 1 ChangeBinInt- 00:07:42.251 [2024-11-29 05:32:53.436441] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.251 [2024-11-29 05:32:53.436630] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.251 [2024-11-29 05:32:53.436781] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.251 [2024-11-29 05:32:53.437141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.437171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.251 [2024-11-29 05:32:53.437287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.437311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.251 [2024-11-29 05:32:53.437435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.437460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.251 [2024-11-29 05:32:53.437579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.437604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.251 #49 NEW cov: 11820 ft: 15082 corp: 35/686b lim: 35 exec/s: 49 rss: 68Mb L: 31/35 MS: 1 ChangeBinInt- 00:07:42.251 [2024-11-29 05:32:53.476564] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.251 [2024-11-29 05:32:53.476755] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.251 [2024-11-29 05:32:53.476916] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.251 [2024-11-29 05:32:53.477276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.477305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.251 [2024-11-29 05:32:53.477416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.477439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.251 [2024-11-29 05:32:53.477560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.477580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.251 [2024-11-29 05:32:53.477701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.477722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.251 #50 NEW cov: 11820 ft: 15083 corp: 36/720b lim: 35 exec/s: 50 rss: 68Mb L: 34/35 MS: 1 ShuffleBytes- 00:07:42.251 [2024-11-29 05:32:53.526605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5858000a cdw11:5800580a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.251 [2024-11-29 05:32:53.526632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.509 #51 NEW cov: 11820 ft: 15085 corp: 37/727b lim: 35 exec/s: 51 rss: 69Mb L: 7/35 MS: 1 EraseBytes- 00:07:42.509 [2024-11-29 05:32:53.566621] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.509 [2024-11-29 05:32:53.566792] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.509 [2024-11-29 05:32:53.566950] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.509 [2024-11-29 05:32:53.567303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.509 [2024-11-29 05:32:53.567331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.509 [2024-11-29 05:32:53.567455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.509 [2024-11-29 05:32:53.567477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.509 [2024-11-29 05:32:53.567612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.509 [2024-11-29 05:32:53.567633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.509 [2024-11-29 05:32:53.567750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:35000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.509 [2024-11-29 05:32:53.567772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.509 #52 NEW cov: 11820 ft: 15109 corp: 38/759b lim: 35 exec/s: 52 rss: 69Mb L: 32/35 MS: 1 CrossOver- 00:07:42.509 [2024-11-29 05:32:53.606933] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.509 [2024-11-29 05:32:53.607104] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.509 [2024-11-29 05:32:53.607252] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.509 [2024-11-29 05:32:53.607592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000004 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.607623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.510 [2024-11-29 05:32:53.607745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.607768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.510 [2024-11-29 05:32:53.607899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.607917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.510 [2024-11-29 05:32:53.608051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1f000000 cdw11:07000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.608073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.510 #53 NEW cov: 11820 ft: 15116 corp: 39/790b lim: 35 exec/s: 53 rss: 69Mb L: 31/35 MS: 1 CMP- DE: "\007\000\000\000"- 00:07:42.510 [2024-11-29 05:32:53.647131] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.510 [2024-11-29 05:32:53.647491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0200000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.647520] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.510 [2024-11-29 05:32:53.647640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:35000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.647660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.510 [2024-11-29 05:32:53.647779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.647802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.510 #54 NEW cov: 11820 ft: 15128 corp: 40/815b lim: 35 exec/s: 54 rss: 69Mb L: 25/35 MS: 1 CrossOver- 00:07:42.510 [2024-11-29 05:32:53.687124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:58000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.687150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.510 #55 NEW cov: 11820 ft: 15179 corp: 41/823b lim: 35 exec/s: 55 rss: 69Mb L: 8/35 MS: 1 EraseBytes- 00:07:42.510 [2024-11-29 05:32:53.727166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4000000a cdw11:43003500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.727192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.510 #56 NEW cov: 11820 ft: 15185 corp: 42/835b lim: 35 exec/s: 56 rss: 69Mb L: 12/35 MS: 1 InsertRepeatedBytes- 00:07:42.510 [2024-11-29 05:32:53.767326] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.510 [2024-11-29 05:32:53.767489] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.510 [2024-11-29 05:32:53.767619] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:42.510 [2024-11-29 05:32:53.767995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.768023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.510 [2024-11-29 05:32:53.768137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.768159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.510 [2024-11-29 05:32:53.768282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.768304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.510 [2024-11-29 05:32:53.768429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00350000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.510 [2024-11-29 05:32:53.768452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.510 #57 NEW cov: 11820 ft: 15188 corp: 43/868b lim: 35 exec/s: 28 rss: 69Mb L: 33/35 MS: 1 InsertByte- 00:07:42.510 #57 DONE cov: 11820 ft: 15188 corp: 43/868b lim: 35 exec/s: 28 rss: 69Mb 00:07:42.510 ###### Recommended dictionary. ###### 00:07:42.510 "5\000\000\000" # Uses: 1 00:07:42.510 "\002\000\000\000" # Uses: 0 00:07:42.510 "\007\000\000\000" # Uses: 0 00:07:42.510 ###### End of recommended dictionary. ###### 00:07:42.510 Done 57 runs in 2 second(s) 00:07:42.768 05:32:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:42.768 05:32:53 -- ../common.sh@72 -- # (( i++ )) 00:07:42.768 05:32:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.768 05:32:53 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:42.768 05:32:53 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:42.768 05:32:53 -- nvmf/run.sh@24 -- # local timen=1 00:07:42.768 05:32:53 -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.768 05:32:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:42.768 05:32:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:42.768 05:32:53 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:42.768 05:32:53 -- nvmf/run.sh@29 -- # port=4403 00:07:42.768 05:32:53 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:42.768 05:32:53 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:42.768 05:32:53 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.768 05:32:53 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:42.768 [2024-11-29 05:32:53.954622] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:42.768 [2024-11-29 05:32:53.954688] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2214282 ] 00:07:42.768 EAL: No free 2048 kB hugepages reported on node 1 00:07:43.026 [2024-11-29 05:32:54.203746] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.026 [2024-11-29 05:32:54.231098] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:43.026 [2024-11-29 05:32:54.231221] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.026 [2024-11-29 05:32:54.282728] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.026 [2024-11-29 05:32:54.299105] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:43.026 INFO: Running with entropic power schedule (0xFF, 100). 00:07:43.026 INFO: Seed: 1986225142 00:07:43.284 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:43.284 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:43.284 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:43.284 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.284 #2 INITED exec/s: 0 rss: 59Mb 00:07:43.284 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:43.284 This may also happen if the target rejected all inputs we tried so far 00:07:43.541 NEW_FUNC[1/659]: 0x456418 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:43.541 NEW_FUNC[2/659]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:43.541 #11 NEW cov: 11506 ft: 11507 corp: 2/17b lim: 20 exec/s: 0 rss: 67Mb L: 16/16 MS: 4 ChangeByte-CrossOver-ChangeByte-InsertRepeatedBytes- 00:07:43.541 #17 NEW cov: 11619 ft: 11896 corp: 3/33b lim: 20 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 ChangeByte- 00:07:43.541 #18 NEW cov: 11625 ft: 12139 corp: 4/49b lim: 20 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 CrossOver- 00:07:43.541 #19 NEW cov: 11710 ft: 12521 corp: 5/66b lim: 20 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 CrossOver- 00:07:43.541 #20 NEW cov: 11710 ft: 12695 corp: 6/84b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 InsertByte- 00:07:43.799 #21 NEW cov: 11710 ft: 12785 corp: 7/100b lim: 20 exec/s: 0 rss: 67Mb L: 16/18 MS: 1 ChangeByte- 00:07:43.799 #22 NEW cov: 11710 ft: 12844 corp: 8/117b lim: 20 exec/s: 0 rss: 67Mb L: 17/18 MS: 1 ShuffleBytes- 00:07:43.799 #23 NEW cov: 11710 ft: 12894 corp: 9/133b lim: 20 exec/s: 0 rss: 67Mb L: 16/18 MS: 1 ChangeBit- 00:07:43.799 #24 NEW cov: 11715 ft: 13388 corp: 10/144b lim: 20 exec/s: 0 rss: 67Mb L: 11/18 MS: 1 EraseBytes- 00:07:43.799 #25 NEW cov: 11715 ft: 13403 corp: 11/160b lim: 20 exec/s: 0 rss: 67Mb L: 16/18 MS: 1 ChangeBinInt- 00:07:43.799 #26 NEW cov: 11715 ft: 13415 corp: 12/176b lim: 20 exec/s: 0 rss: 67Mb L: 16/18 MS: 1 ChangeBit- 00:07:43.799 #27 NEW cov: 11715 ft: 13499 corp: 13/196b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CopyPart- 00:07:44.056 #28 NEW cov: 11715 ft: 13537 corp: 14/214b lim: 20 exec/s: 0 rss: 68Mb L: 18/20 MS: 1 ChangeBit- 00:07:44.056 #29 NEW cov: 11715 ft: 13580 corp: 15/233b lim: 20 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 InsertByte- 
00:07:44.056 #30 NEW cov: 11715 ft: 13603 corp: 16/253b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:44.056 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:44.056 #31 NEW cov: 11738 ft: 13643 corp: 17/272b lim: 20 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 CrossOver- 00:07:44.056 #32 NEW cov: 11738 ft: 13726 corp: 18/289b lim: 20 exec/s: 0 rss: 68Mb L: 17/20 MS: 1 CrossOver- 00:07:44.056 #33 NEW cov: 11738 ft: 13730 corp: 19/309b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:44.056 #34 NEW cov: 11738 ft: 13736 corp: 20/327b lim: 20 exec/s: 34 rss: 68Mb L: 18/20 MS: 1 CrossOver- 00:07:44.312 #35 NEW cov: 11738 ft: 13755 corp: 21/345b lim: 20 exec/s: 35 rss: 68Mb L: 18/20 MS: 1 CopyPart- 00:07:44.312 #36 NEW cov: 11738 ft: 13773 corp: 22/365b lim: 20 exec/s: 36 rss: 68Mb L: 20/20 MS: 1 ChangeBit- 00:07:44.313 #37 NEW cov: 11738 ft: 13797 corp: 23/385b lim: 20 exec/s: 37 rss: 68Mb L: 20/20 MS: 1 CopyPart- 00:07:44.313 #38 NEW cov: 11738 ft: 13822 corp: 24/404b lim: 20 exec/s: 38 rss: 68Mb L: 19/20 MS: 1 ChangeBit- 00:07:44.313 #39 NEW cov: 11738 ft: 14113 corp: 25/410b lim: 20 exec/s: 39 rss: 68Mb L: 6/20 MS: 1 EraseBytes- 00:07:44.313 #40 NEW cov: 11738 ft: 14152 corp: 26/419b lim: 20 exec/s: 40 rss: 69Mb L: 9/20 MS: 1 CrossOver- 00:07:44.570 #41 NEW cov: 11738 ft: 14181 corp: 27/430b lim: 20 exec/s: 41 rss: 69Mb L: 11/20 MS: 1 EraseBytes- 00:07:44.570 #42 NEW cov: 11738 ft: 14196 corp: 28/446b lim: 20 exec/s: 42 rss: 69Mb L: 16/20 MS: 1 ShuffleBytes- 00:07:44.570 #43 NEW cov: 11738 ft: 14205 corp: 29/466b lim: 20 exec/s: 43 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:07:44.570 #44 NEW cov: 11738 ft: 14215 corp: 30/482b lim: 20 exec/s: 44 rss: 69Mb L: 16/20 MS: 1 ShuffleBytes- 00:07:44.570 #45 NEW cov: 11742 ft: 14336 corp: 31/495b lim: 20 exec/s: 45 rss: 69Mb L: 13/20 MS: 1 EraseBytes- 00:07:44.570 #46 NEW cov: 11742 ft: 14349 corp: 32/514b lim: 20 exec/s: 46 rss: 69Mb L: 19/20 MS: 1 InsertRepeatedBytes- 00:07:44.570 #47 NEW cov: 11742 ft: 14377 corp: 33/534b lim: 20 exec/s: 47 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:07:44.828 #48 NEW cov: 11742 ft: 14450 corp: 34/550b lim: 20 exec/s: 48 rss: 69Mb L: 16/20 MS: 1 ChangeBinInt- 00:07:44.828 #49 NEW cov: 11742 ft: 14544 corp: 35/568b lim: 20 exec/s: 49 rss: 69Mb L: 18/20 MS: 1 ChangeBinInt- 00:07:44.828 #50 NEW cov: 11742 ft: 14546 corp: 36/584b lim: 20 exec/s: 50 rss: 69Mb L: 16/20 MS: 1 CopyPart- 00:07:44.828 #51 NEW cov: 11742 ft: 14551 corp: 37/600b lim: 20 exec/s: 51 rss: 69Mb L: 16/20 MS: 1 ChangeByte- 00:07:44.828 #52 NEW cov: 11742 ft: 14572 corp: 38/619b lim: 20 exec/s: 52 rss: 69Mb L: 19/20 MS: 1 CopyPart- 00:07:44.828 #53 NEW cov: 11742 ft: 14577 corp: 39/629b lim: 20 exec/s: 53 rss: 69Mb L: 10/20 MS: 1 EraseBytes- 00:07:45.087 #54 NEW cov: 11742 ft: 14586 corp: 40/649b lim: 20 exec/s: 54 rss: 69Mb L: 20/20 MS: 1 InsertByte- 00:07:45.087 #55 NEW cov: 11742 ft: 14601 corp: 41/658b lim: 20 exec/s: 55 rss: 69Mb L: 9/20 MS: 1 ShuffleBytes- 00:07:45.087 #56 NEW cov: 11742 ft: 14628 corp: 42/676b lim: 20 exec/s: 56 rss: 69Mb L: 18/20 MS: 1 ChangeBit- 00:07:45.087 #57 NEW cov: 11742 ft: 14659 corp: 43/696b lim: 20 exec/s: 57 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:45.087 #58 NEW cov: 11742 ft: 14667 corp: 44/716b lim: 20 exec/s: 58 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:45.087 #59 NEW cov: 11742 ft: 14681 corp: 45/734b lim: 20 exec/s: 29 rss: 70Mb L: 18/20 MS: 1 ChangeBinInt- 00:07:45.087 #59 DONE cov: 11742 ft: 14681 corp: 
45/734b lim: 20 exec/s: 29 rss: 70Mb 00:07:45.087 Done 59 runs in 2 second(s) 00:07:45.347 05:32:56 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:45.347 05:32:56 -- ../common.sh@72 -- # (( i++ )) 00:07:45.347 05:32:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.347 05:32:56 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:45.347 05:32:56 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:45.347 05:32:56 -- nvmf/run.sh@24 -- # local timen=1 00:07:45.347 05:32:56 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.347 05:32:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:45.347 05:32:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:45.347 05:32:56 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:45.347 05:32:56 -- nvmf/run.sh@29 -- # port=4404 00:07:45.347 05:32:56 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:45.347 05:32:56 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:45.347 05:32:56 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.347 05:32:56 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:45.347 [2024-11-29 05:32:56.499142] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:45.347 [2024-11-29 05:32:56.499230] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2214795 ] 00:07:45.347 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.606 [2024-11-29 05:32:56.752978] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.606 [2024-11-29 05:32:56.781162] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:45.606 [2024-11-29 05:32:56.781279] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.606 [2024-11-29 05:32:56.832493] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.606 [2024-11-29 05:32:56.848848] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:45.606 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.606 INFO: Seed: 243249900 00:07:45.606 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:45.606 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:45.606 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:45.606 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.606 #2 INITED exec/s: 0 rss: 59Mb 00:07:45.606 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:45.606 This may also happen if the target rejected all inputs we tried so far 00:07:45.606 [2024-11-29 05:32:56.904654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.606 [2024-11-29 05:32:56.904683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.606 [2024-11-29 05:32:56.904739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.606 [2024-11-29 05:32:56.904754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.606 [2024-11-29 05:32:56.904809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.606 [2024-11-29 05:32:56.904821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.606 [2024-11-29 05:32:56.904880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.606 [2024-11-29 05:32:56.904894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.124 NEW_FUNC[1/671]: 0x457518 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:46.124 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:46.124 #11 NEW cov: 11604 ft: 11606 corp: 2/34b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 4 CopyPart-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:46.124 [2024-11-29 05:32:57.225362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.225394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.225449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.225463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.225517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.225531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.225584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.225601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.124 
#27 NEW cov: 11718 ft: 12166 corp: 3/67b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeBinInt- 00:07:46.124 [2024-11-29 05:32:57.275457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.275487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.275543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.275557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.275607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.275621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.275674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.275688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.124 #28 NEW cov: 11724 ft: 12416 corp: 4/100b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ShuffleBytes- 00:07:46.124 [2024-11-29 05:32:57.315501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.315528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.315584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.315601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.315655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.315669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.315724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.315737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.124 #29 NEW cov: 11809 ft: 12730 corp: 5/131b lim: 35 exec/s: 0 rss: 68Mb L: 31/33 MS: 1 EraseBytes- 00:07:46.124 [2024-11-29 05:32:57.355454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2d2d2d2d cdw11:2d2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.355480] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.355536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2d2d2d2d cdw11:2d2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.355550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.124 [2024-11-29 05:32:57.355607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2d2d2d2d cdw11:2d2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.355621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.124 #31 NEW cov: 11809 ft: 13216 corp: 6/158b lim: 35 exec/s: 0 rss: 68Mb L: 27/33 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:46.124 [2024-11-29 05:32:57.395267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b2b2b2b2 cdw11:b2b20001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.124 [2024-11-29 05:32:57.395295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.124 #32 NEW cov: 11809 ft: 14075 corp: 7/169b lim: 35 exec/s: 0 rss: 68Mb L: 11/33 MS: 1 InsertRepeatedBytes- 00:07:46.384 [2024-11-29 05:32:57.435655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.435680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.384 [2024-11-29 05:32:57.435737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.435751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.384 [2024-11-29 05:32:57.435806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.435819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.384 #38 NEW cov: 11809 ft: 14177 corp: 8/196b lim: 35 exec/s: 0 rss: 68Mb L: 27/33 MS: 1 CrossOver- 00:07:46.384 [2024-11-29 05:32:57.475993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.476018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.384 [2024-11-29 05:32:57.476072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.476086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.384 [2024-11-29 05:32:57.476140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 
cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e41c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.476154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.384 [2024-11-29 05:32:57.476207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1b1b1b1b cdw11:1b250003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.476220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.384 #39 NEW cov: 11809 ft: 14265 corp: 9/229b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeBinInt- 00:07:46.384 [2024-11-29 05:32:57.516110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.516135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.384 [2024-11-29 05:32:57.516190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.516203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.384 [2024-11-29 05:32:57.516257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.516270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.384 [2024-11-29 05:32:57.516321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.516337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.384 #40 NEW cov: 11809 ft: 14310 corp: 10/262b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeBit- 00:07:46.384 [2024-11-29 05:32:57.556246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.384 [2024-11-29 05:32:57.556273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.385 [2024-11-29 05:32:57.556326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.556341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.385 [2024-11-29 05:32:57.556395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.556409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.385 [2024-11-29 05:32:57.556462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 
cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.556476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.385 #41 NEW cov: 11809 ft: 14353 corp: 11/295b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeBit- 00:07:46.385 [2024-11-29 05:32:57.596387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.596413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.385 [2024-11-29 05:32:57.596468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.596483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.385 [2024-11-29 05:32:57.596537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.596551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.385 [2024-11-29 05:32:57.596607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.596620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.385 #42 NEW cov: 11809 ft: 14410 corp: 12/325b lim: 35 exec/s: 0 rss: 68Mb L: 30/33 MS: 1 EraseBytes- 00:07:46.385 [2024-11-29 05:32:57.636042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b2b2b2b2 cdw11:b2b20001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.636068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.385 #43 NEW cov: 11809 ft: 14453 corp: 13/336b lim: 35 exec/s: 0 rss: 68Mb L: 11/33 MS: 1 CopyPart- 00:07:46.385 [2024-11-29 05:32:57.676476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.676502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.385 [2024-11-29 05:32:57.676561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.676575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.385 [2024-11-29 05:32:57.676622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e425e4e4 cdw11:1be40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.385 [2024-11-29 05:32:57.676636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.644 #44 NEW 
cov: 11809 ft: 14540 corp: 14/357b lim: 35 exec/s: 0 rss: 68Mb L: 21/33 MS: 1 EraseBytes- 00:07:46.644 [2024-11-29 05:32:57.716718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.716744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.644 [2024-11-29 05:32:57.716800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.716814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.644 [2024-11-29 05:32:57.716866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.716879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.644 [2024-11-29 05:32:57.716932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.716945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.644 #45 NEW cov: 11809 ft: 14563 corp: 15/391b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 InsertByte- 00:07:46.644 [2024-11-29 05:32:57.756844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.756869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.644 [2024-11-29 05:32:57.756905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.756918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.644 [2024-11-29 05:32:57.756976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.756990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.644 [2024-11-29 05:32:57.757044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.757057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.644 #46 NEW cov: 11809 ft: 14588 corp: 16/425b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 InsertByte- 00:07:46.644 [2024-11-29 05:32:57.796951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.796976] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.644 [2024-11-29 05:32:57.797036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.797049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.644 [2024-11-29 05:32:57.797103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.797117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.644 [2024-11-29 05:32:57.797172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.644 [2024-11-29 05:32:57.797186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.644 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.645 #47 NEW cov: 11832 ft: 14619 corp: 17/455b lim: 35 exec/s: 0 rss: 69Mb L: 30/34 MS: 1 EraseBytes- 00:07:46.645 [2024-11-29 05:32:57.847094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.847120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.645 [2024-11-29 05:32:57.847194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.847210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.645 [2024-11-29 05:32:57.847265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.847280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.645 [2024-11-29 05:32:57.847335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:3de4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.847348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.645 #48 NEW cov: 11832 ft: 14669 corp: 18/489b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 InsertByte- 00:07:46.645 [2024-11-29 05:32:57.887035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:1c1b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.887061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.645 [2024-11-29 05:32:57.887135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 
nsid:0 cdw10:1b1b1b1b cdw11:1be40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.887150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.645 [2024-11-29 05:32:57.887203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.887217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.645 #49 NEW cov: 11832 ft: 14714 corp: 19/516b lim: 35 exec/s: 49 rss: 69Mb L: 27/34 MS: 1 ChangeBinInt- 00:07:46.645 [2024-11-29 05:32:57.927301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.927326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.645 [2024-11-29 05:32:57.927385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.927398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.645 [2024-11-29 05:32:57.927452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e400e4e4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.927465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.645 [2024-11-29 05:32:57.927516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00220000 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.645 [2024-11-29 05:32:57.927529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.905 #50 NEW cov: 11832 ft: 14726 corp: 20/550b lim: 35 exec/s: 50 rss: 69Mb L: 34/34 MS: 1 ChangeBinInt- 00:07:46.905 [2024-11-29 05:32:57.967459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:57.967485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:57.967541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:57.967554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:57.967613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:57.967627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:57.967682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 
cid:7 nsid:0 cdw10:e4e49be4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:57.967696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.905 #51 NEW cov: 11832 ft: 14730 corp: 21/583b lim: 35 exec/s: 51 rss: 69Mb L: 33/34 MS: 1 ChangeByte- 00:07:46.905 [2024-11-29 05:32:58.007513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.007538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.007595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.007614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.007668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.007681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.007734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.007748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.905 #52 NEW cov: 11832 ft: 14746 corp: 22/617b lim: 35 exec/s: 52 rss: 69Mb L: 34/34 MS: 1 ChangeASCIIInt- 00:07:46.905 [2024-11-29 05:32:58.047653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.047679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.047733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.047747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.047802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.047815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.047867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.047880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.905 #53 NEW cov: 11832 ft: 14838 corp: 23/651b lim: 35 exec/s: 53 rss: 69Mb L: 34/34 MS: 1 ShuffleBytes- 
00:07:46.905 [2024-11-29 05:32:58.087785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.087810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.087865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.087878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.087933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.087947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.088002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4dfe4e4 cdw11:e4e40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.088016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.905 #54 NEW cov: 11832 ft: 14846 corp: 24/685b lim: 35 exec/s: 54 rss: 69Mb L: 34/34 MS: 1 ChangeBinInt- 00:07:46.905 [2024-11-29 05:32:58.127421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b2b2b2b2 cdw11:b2b20001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.127446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.905 #55 NEW cov: 11832 ft: 14867 corp: 25/696b lim: 35 exec/s: 55 rss: 69Mb L: 11/34 MS: 1 ChangeByte- 00:07:46.905 [2024-11-29 05:32:58.168038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.168064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.168119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.168132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.168185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.168199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.905 [2024-11-29 05:32:58.168254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e5 cdw11:e4e40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.905 [2024-11-29 05:32:58.168267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:07:46.905 #56 NEW cov: 11832 ft: 15015 corp: 26/727b lim: 35 exec/s: 56 rss: 69Mb L: 31/34 MS: 1 ChangeBit- 00:07:47.165 [2024-11-29 05:32:58.208029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.165 [2024-11-29 05:32:58.208055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.165 [2024-11-29 05:32:58.208111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4aa0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.165 [2024-11-29 05:32:58.208125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.165 [2024-11-29 05:32:58.208180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e425e4e4 cdw11:1be40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.165 [2024-11-29 05:32:58.208193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.165 #57 NEW cov: 11832 ft: 15079 corp: 27/748b lim: 35 exec/s: 57 rss: 69Mb L: 21/34 MS: 1 ChangeByte- 00:07:47.165 [2024-11-29 05:32:58.247820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b2b2b2b2 cdw11:b2b20001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.165 [2024-11-29 05:32:58.247844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.165 #58 NEW cov: 11832 ft: 15080 corp: 28/760b lim: 35 exec/s: 58 rss: 69Mb L: 12/34 MS: 1 InsertByte- 00:07:47.165 [2024-11-29 05:32:58.288347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.165 [2024-11-29 05:32:58.288372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.165 [2024-11-29 05:32:58.288427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.165 [2024-11-29 05:32:58.288441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.288495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.288509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.288564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e5 cdw11:e4e40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.288578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.166 #59 NEW cov: 11832 ft: 15101 corp: 29/791b lim: 35 exec/s: 59 rss: 69Mb L: 31/34 MS: 1 ShuffleBytes- 00:07:47.166 [2024-11-29 05:32:58.328516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 
cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.328544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.328606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.328620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.328674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e0e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.328688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.328740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.328753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.166 #60 NEW cov: 11832 ft: 15111 corp: 30/824b lim: 35 exec/s: 60 rss: 69Mb L: 33/34 MS: 1 ChangeBit- 00:07:47.166 [2024-11-29 05:32:58.368594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.368623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.368682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.368696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.368751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.368764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.368818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4dfe4e4 cdw11:e4e40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.368831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.166 #61 NEW cov: 11832 ft: 15113 corp: 31/858b lim: 35 exec/s: 61 rss: 70Mb L: 34/34 MS: 1 ChangeByte- 00:07:47.166 [2024-11-29 05:32:58.408718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.408743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.408797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 
cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.408810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.408861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.408875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.408930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.408943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.166 #62 NEW cov: 11832 ft: 15140 corp: 32/892b lim: 35 exec/s: 62 rss: 70Mb L: 34/34 MS: 1 ShuffleBytes- 00:07:47.166 [2024-11-29 05:32:58.448824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.448849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.448904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.448918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.448970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.448983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.166 [2024-11-29 05:32:58.449035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4dfe4e4 cdw11:e4e40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.166 [2024-11-29 05:32:58.449048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.426 #63 NEW cov: 11832 ft: 15180 corp: 33/926b lim: 35 exec/s: 63 rss: 70Mb L: 34/34 MS: 1 CopyPart- 00:07:47.426 [2024-11-29 05:32:58.488944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.488969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.489023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.489037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.489089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 
cdw10:e4e4e461 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.489102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.489157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.489170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.426 #64 NEW cov: 11832 ft: 15188 corp: 34/959b lim: 35 exec/s: 64 rss: 70Mb L: 33/34 MS: 1 ChangeByte- 00:07:47.426 [2024-11-29 05:32:58.519058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.519082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.519136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.519150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.519203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.519216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.519273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.519286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.426 #65 NEW cov: 11832 ft: 15191 corp: 35/992b lim: 35 exec/s: 65 rss: 70Mb L: 33/34 MS: 1 ShuffleBytes- 00:07:47.426 [2024-11-29 05:32:58.559179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.559203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.559261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.559275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.559330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.559344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.559398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 
nsid:0 cdw10:e4e4e4e4 cdw11:e4330003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.559411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.426 #66 NEW cov: 11832 ft: 15197 corp: 36/1020b lim: 35 exec/s: 66 rss: 70Mb L: 28/34 MS: 1 EraseBytes- 00:07:47.426 [2024-11-29 05:32:58.599294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.599319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.599373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.599387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.599443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e461 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.599456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.599511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.599524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.426 #67 NEW cov: 11832 ft: 15206 corp: 37/1053b lim: 35 exec/s: 67 rss: 70Mb L: 33/34 MS: 1 ChangeByte- 00:07:47.426 [2024-11-29 05:32:58.639214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.639239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.639294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.639308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.426 [2024-11-29 05:32:58.639365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:aae40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.639378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.426 #68 NEW cov: 11832 ft: 15217 corp: 38/1074b lim: 35 exec/s: 68 rss: 70Mb L: 21/34 MS: 1 CopyPart- 00:07:47.426 [2024-11-29 05:32:58.679010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b2b2b2b2 cdw11:b2b20001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.426 [2024-11-29 05:32:58.679035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.426 #69 
NEW cov: 11832 ft: 15225 corp: 39/1085b lim: 35 exec/s: 69 rss: 70Mb L: 11/34 MS: 1 CopyPart- 00:07:47.426 [2024-11-29 05:32:58.719635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.427 [2024-11-29 05:32:58.719661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.427 [2024-11-29 05:32:58.719720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.427 [2024-11-29 05:32:58.719735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.427 [2024-11-29 05:32:58.719792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.427 [2024-11-29 05:32:58.719805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.427 [2024-11-29 05:32:58.719862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.427 [2024-11-29 05:32:58.719876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.686 #70 NEW cov: 11832 ft: 15235 corp: 40/1118b lim: 35 exec/s: 70 rss: 70Mb L: 33/34 MS: 1 ShuffleBytes- 00:07:47.686 [2024-11-29 05:32:58.759761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.759787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.686 [2024-11-29 05:32:58.759843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.759857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.686 [2024-11-29 05:32:58.759910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e461 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.759923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.686 [2024-11-29 05:32:58.759977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e425e4e4 cdw11:1ba40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.759990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.686 #71 NEW cov: 11832 ft: 15271 corp: 41/1146b lim: 35 exec/s: 71 rss: 70Mb L: 28/34 MS: 1 EraseBytes- 00:07:47.686 [2024-11-29 05:32:58.799856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:dae44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.799884] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.686 [2024-11-29 05:32:58.799942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.799955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.686 [2024-11-29 05:32:58.800010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.800024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.686 [2024-11-29 05:32:58.800079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e49be4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.800092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.686 #72 NEW cov: 11832 ft: 15311 corp: 42/1179b lim: 35 exec/s: 72 rss: 70Mb L: 33/34 MS: 1 ChangeByte- 00:07:47.686 [2024-11-29 05:32:58.839949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e5e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.839974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.686 [2024-11-29 05:32:58.840029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.840042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.686 [2024-11-29 05:32:58.840096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.686 [2024-11-29 05:32:58.840109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.686 [2024-11-29 05:32:58.840165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.687 [2024-11-29 05:32:58.840179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.687 #73 NEW cov: 11832 ft: 15313 corp: 43/1212b lim: 35 exec/s: 73 rss: 70Mb L: 33/34 MS: 1 ChangeBit- 00:07:47.687 [2024-11-29 05:32:58.880080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e4e44ae4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.687 [2024-11-29 05:32:58.880104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.687 [2024-11-29 05:32:58.880157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.687 [2024-11-29 05:32:58.880171] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.687 [2024-11-29 05:32:58.880224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.687 [2024-11-29 05:32:58.880237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.687 [2024-11-29 05:32:58.880290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2d2d2d2d cdw11:2d2d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.687 [2024-11-29 05:32:58.880303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.687 #74 NEW cov: 11832 ft: 15316 corp: 44/1240b lim: 35 exec/s: 37 rss: 70Mb L: 28/34 MS: 1 InsertByte- 00:07:47.687 #74 DONE cov: 11832 ft: 15316 corp: 44/1240b lim: 35 exec/s: 37 rss: 70Mb 00:07:47.687 Done 74 runs in 2 second(s) 00:07:47.947 05:32:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:47.947 05:32:59 -- ../common.sh@72 -- # (( i++ )) 00:07:47.947 05:32:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.947 05:32:59 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:47.947 05:32:59 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:47.947 05:32:59 -- nvmf/run.sh@24 -- # local timen=1 00:07:47.947 05:32:59 -- nvmf/run.sh@25 -- # local core=0x1 00:07:47.947 05:32:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:47.947 05:32:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:47.947 05:32:59 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:47.947 05:32:59 -- nvmf/run.sh@29 -- # port=4405 00:07:47.947 05:32:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:47.947 05:32:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:47.947 05:32:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:47.947 05:32:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:47.947 [2024-11-29 05:32:59.060068] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:47.947 [2024-11-29 05:32:59.060162] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2215118 ] 00:07:47.947 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.207 [2024-11-29 05:32:59.313757] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.207 [2024-11-29 05:32:59.342398] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:48.207 [2024-11-29 05:32:59.342540] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.207 [2024-11-29 05:32:59.394176] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:48.207 [2024-11-29 05:32:59.410540] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:48.207 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.207 INFO: Seed: 2804263352 00:07:48.207 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:48.207 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:48.207 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:48.207 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.207 #2 INITED exec/s: 0 rss: 60Mb 00:07:48.207 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:48.207 This may also happen if the target rejected all inputs we tried so far 00:07:48.207 [2024-11-29 05:32:59.487971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.207 [2024-11-29 05:32:59.488011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.207 [2024-11-29 05:32:59.488094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.207 [2024-11-29 05:32:59.488109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.728 NEW_FUNC[1/671]: 0x4596b8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:48.728 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.728 #20 NEW cov: 11616 ft: 11617 corp: 2/23b lim: 45 exec/s: 0 rss: 67Mb L: 22/22 MS: 3 InsertByte-EraseBytes-InsertRepeatedBytes- 00:07:48.728 [2024-11-29 05:32:59.808172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.728 [2024-11-29 05:32:59.808221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.728 [2024-11-29 05:32:59.808353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.728 [2024-11-29 05:32:59.808376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.728 #21 NEW cov: 11729 ft: 12249 corp: 3/45b lim: 45 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 ChangeBit- 00:07:48.728 [2024-11-29 05:32:59.868368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.728 [2024-11-29 05:32:59.868397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.728 [2024-11-29 05:32:59.868530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0606fd05 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.728 [2024-11-29 05:32:59.868548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.728 #32 NEW cov: 11735 ft: 12496 corp: 4/67b lim: 45 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 ChangeBinInt- 00:07:48.728 [2024-11-29 05:32:59.928763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.728 [2024-11-29 05:32:59.928792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.728 [2024-11-29 05:32:59.928915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.728 [2024-11-29 05:32:59.928932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.728 [2024-11-29 05:32:59.929052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.728 [2024-11-29 05:32:59.929070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.728 #33 NEW cov: 11820 ft: 13101 corp: 5/99b lim: 45 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:48.728 [2024-11-29 05:32:59.978212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.728 [2024-11-29 05:32:59.978240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.728 #34 NEW cov: 11820 ft: 13933 corp: 6/114b lim: 45 exec/s: 0 rss: 67Mb L: 15/32 MS: 1 EraseBytes- 00:07:48.986 [2024-11-29 05:33:00.038530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3c0a3cbc cdw11:0abc0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.038562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.986 #39 NEW cov: 11820 ft: 14042 corp: 7/123b lim: 45 exec/s: 0 rss: 67Mb L: 9/32 MS: 5 CopyPart-ChangeByte-CopyPart-InsertByte-CopyPart- 00:07:48.986 [2024-11-29 05:33:00.099649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.099692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.986 [2024-11-29 05:33:00.099822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.099841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.986 [2024-11-29 05:33:00.099975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.099991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.986 [2024-11-29 05:33:00.100113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.100132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.986 #40 NEW cov: 11820 ft: 14438 corp: 8/163b lim: 45 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CopyPart- 00:07:48.986 [2024-11-29 05:33:00.149118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.149147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.986 [2024-11-29 05:33:00.149266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.149283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.986 #41 NEW cov: 11820 ft: 14603 corp: 9/185b lim: 45 exec/s: 0 rss: 67Mb L: 22/40 MS: 1 ShuffleBytes- 00:07:48.986 [2024-11-29 05:33:00.199295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.199324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.986 [2024-11-29 05:33:00.199449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.199468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.986 #42 NEW cov: 11820 ft: 14642 corp: 10/205b lim: 45 exec/s: 0 rss: 68Mb L: 20/40 MS: 1 EraseBytes- 00:07:48.986 [2024-11-29 05:33:00.249284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-29 05:33:00.249314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.986 #44 NEW cov: 11820 ft: 14653 corp: 11/219b lim: 45 exec/s: 0 rss: 68Mb L: 14/40 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:49.243 [2024-11-29 05:33:00.300022] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.243 [2024-11-29 05:33:00.300057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.243 [2024-11-29 05:33:00.300173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.243 [2024-11-29 05:33:00.300190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.243 [2024-11-29 05:33:00.300331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:06000606 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.243 [2024-11-29 05:33:00.300350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.243 #45 NEW cov: 11820 ft: 14665 corp: 12/251b lim: 45 exec/s: 0 rss: 68Mb L: 32/40 MS: 1 CrossOver- 00:07:49.243 [2024-11-29 05:33:00.359820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06ab0606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.243 [2024-11-29 05:33:00.359849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.243 [2024-11-29 05:33:00.359972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:050606fd cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-29 05:33:00.360016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.244 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.244 #46 NEW cov: 11843 ft: 14719 corp: 13/274b lim: 45 exec/s: 0 rss: 68Mb L: 23/40 MS: 1 InsertByte- 00:07:49.244 [2024-11-29 05:33:00.409706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3c0a3cbc cdw11:0abc0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-29 05:33:00.409733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.244 #52 NEW cov: 11843 ft: 14743 corp: 14/283b lim: 45 exec/s: 0 rss: 68Mb L: 9/40 MS: 1 ChangeByte- 00:07:49.244 [2024-11-29 05:33:00.470805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-29 05:33:00.470834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.244 [2024-11-29 05:33:00.470958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-29 05:33:00.470975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.244 [2024-11-29 05:33:00.471092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:49.244 [2024-11-29 05:33:00.471109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.244 [2024-11-29 05:33:00.471234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-29 05:33:00.471253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.244 #53 NEW cov: 11843 ft: 14762 corp: 15/323b lim: 45 exec/s: 53 rss: 68Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:49.244 [2024-11-29 05:33:00.530811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-29 05:33:00.530838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.244 [2024-11-29 05:33:00.530984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-29 05:33:00.531002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.244 [2024-11-29 05:33:00.531136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000606 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-29 05:33:00.531158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.548 #54 NEW cov: 11843 ft: 14771 corp: 16/352b lim: 45 exec/s: 54 rss: 68Mb L: 29/40 MS: 1 InsertRepeatedBytes- 00:07:49.548 [2024-11-29 05:33:00.580916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.548 [2024-11-29 05:33:00.580946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.548 [2024-11-29 05:33:00.581081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.548 [2024-11-29 05:33:00.581099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.548 [2024-11-29 05:33:00.581231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.581248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.549 #55 NEW cov: 11843 ft: 14790 corp: 17/384b lim: 45 exec/s: 55 rss: 68Mb L: 32/40 MS: 1 CopyPart- 00:07:49.549 [2024-11-29 05:33:00.641046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.641075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.549 [2024-11-29 05:33:00.641197] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.641216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.549 [2024-11-29 05:33:00.641340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.641356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.549 #56 NEW cov: 11843 ft: 14825 corp: 18/416b lim: 45 exec/s: 56 rss: 68Mb L: 32/40 MS: 1 ChangeBit- 00:07:49.549 [2024-11-29 05:33:00.690935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.690963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.549 [2024-11-29 05:33:00.691078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.691094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.549 #57 NEW cov: 11843 ft: 14846 corp: 19/438b lim: 45 exec/s: 57 rss: 68Mb L: 22/40 MS: 1 ChangeBit- 00:07:49.549 [2024-11-29 05:33:00.741078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.741107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.549 [2024-11-29 05:33:00.741229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0506fd06 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.741249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.549 #58 NEW cov: 11843 ft: 14867 corp: 20/460b lim: 45 exec/s: 58 rss: 68Mb L: 22/40 MS: 1 ShuffleBytes- 00:07:49.549 [2024-11-29 05:33:00.791250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.791280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.549 [2024-11-29 05:33:00.791412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0706fd06 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.549 [2024-11-29 05:33:00.791431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.549 #59 NEW cov: 11843 ft: 14886 corp: 21/482b lim: 45 exec/s: 59 rss: 68Mb L: 22/40 MS: 1 ChangeBit- 00:07:49.815 [2024-11-29 05:33:00.852014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:49.815 [2024-11-29 05:33:00.852045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.815 [2024-11-29 05:33:00.852163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.815 [2024-11-29 05:33:00.852180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.815 [2024-11-29 05:33:00.852301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.815 [2024-11-29 05:33:00.852319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.815 [2024-11-29 05:33:00.852438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.815 [2024-11-29 05:33:00.852456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.815 #60 NEW cov: 11843 ft: 14901 corp: 22/524b lim: 45 exec/s: 60 rss: 68Mb L: 42/42 MS: 1 CrossOver- 00:07:49.815 [2024-11-29 05:33:00.911604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.815 [2024-11-29 05:33:00.911637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.815 [2024-11-29 05:33:00.911765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0706fd06 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.815 [2024-11-29 05:33:00.911785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.815 #61 NEW cov: 11843 ft: 14909 corp: 23/546b lim: 45 exec/s: 61 rss: 68Mb L: 22/42 MS: 1 ShuffleBytes- 00:07:49.815 [2024-11-29 05:33:00.971786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.815 [2024-11-29 05:33:00.971817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.815 [2024-11-29 05:33:00.971941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0506fd06 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.815 [2024-11-29 05:33:00.971959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.815 #62 NEW cov: 11843 ft: 14923 corp: 24/568b lim: 45 exec/s: 62 rss: 68Mb L: 22/42 MS: 1 ChangeBit- 00:07:49.816 [2024-11-29 05:33:01.022524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.816 [2024-11-29 05:33:01.022559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.816 [2024-11-29 05:33:01.022680] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.816 [2024-11-29 05:33:01.022700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.816 [2024-11-29 05:33:01.022826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:97970004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.816 [2024-11-29 05:33:01.022843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.816 [2024-11-29 05:33:01.022968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:06009706 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.816 [2024-11-29 05:33:01.022986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.816 #63 NEW cov: 11843 ft: 14945 corp: 25/606b lim: 45 exec/s: 63 rss: 69Mb L: 38/42 MS: 1 InsertRepeatedBytes- 00:07:49.816 [2024-11-29 05:33:01.082150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.816 [2024-11-29 05:33:01.082179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.816 [2024-11-29 05:33:01.082299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:97979797 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.816 [2024-11-29 05:33:01.082315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.816 #64 NEW cov: 11843 ft: 14954 corp: 26/629b lim: 45 exec/s: 64 rss: 69Mb L: 23/42 MS: 1 EraseBytes- 00:07:50.090 [2024-11-29 05:33:01.142361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3c0a3cbc cdw11:3cbc0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.142391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.090 [2024-11-29 05:33:01.142518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0a0abc00 cdw11:0abc0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.142538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.090 #65 NEW cov: 11843 ft: 14969 corp: 27/647b lim: 45 exec/s: 65 rss: 69Mb L: 18/42 MS: 1 CopyPart- 00:07:50.090 [2024-11-29 05:33:01.203015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.203044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.090 [2024-11-29 05:33:01.203184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:fc060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.203200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.090 [2024-11-29 05:33:01.203328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.203345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.090 [2024-11-29 05:33:01.203477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.203498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.090 #66 NEW cov: 11843 ft: 15003 corp: 28/687b lim: 45 exec/s: 66 rss: 69Mb L: 40/42 MS: 1 ChangeByte- 00:07:50.090 [2024-11-29 05:33:01.252641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06ab0606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.252668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.090 [2024-11-29 05:33:01.252802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:050606fd cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.252821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.090 #67 NEW cov: 11843 ft: 15017 corp: 29/710b lim: 45 exec/s: 67 rss: 69Mb L: 23/42 MS: 1 CopyPart- 00:07:50.090 [2024-11-29 05:33:01.313170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.313198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.090 [2024-11-29 05:33:01.313327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060604 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.313345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.090 [2024-11-29 05:33:01.313461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.090 [2024-11-29 05:33:01.313480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.090 #68 NEW cov: 11843 ft: 15087 corp: 30/742b lim: 45 exec/s: 68 rss: 69Mb L: 32/42 MS: 1 ChangeBit- 00:07:50.090 [2024-11-29 05:33:01.363079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.091 [2024-11-29 05:33:01.363108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.091 [2024-11-29 05:33:01.363225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:50.091 [2024-11-29 05:33:01.363245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.091 #69 NEW cov: 11843 ft: 15107 corp: 31/764b lim: 45 exec/s: 69 rss: 69Mb L: 22/42 MS: 1 CMP- DE: "v\000\000\000"- 00:07:50.357 [2024-11-29 05:33:01.413168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:06060606 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.357 [2024-11-29 05:33:01.413195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.357 [2024-11-29 05:33:01.413325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0706fd06 cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.357 [2024-11-29 05:33:01.413343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.357 #70 NEW cov: 11843 ft: 15118 corp: 32/786b lim: 45 exec/s: 70 rss: 69Mb L: 22/42 MS: 1 ChangeBit- 00:07:50.357 [2024-11-29 05:33:01.463271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:f1ab06fa cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.357 [2024-11-29 05:33:01.463300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.357 [2024-11-29 05:33:01.463434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:050606fd cdw11:06060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.357 [2024-11-29 05:33:01.463452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.357 #71 NEW cov: 11843 ft: 15126 corp: 33/809b lim: 45 exec/s: 35 rss: 69Mb L: 23/42 MS: 1 ChangeBinInt- 00:07:50.357 #71 DONE cov: 11843 ft: 15126 corp: 33/809b lim: 45 exec/s: 35 rss: 69Mb 00:07:50.357 ###### Recommended dictionary. ###### 00:07:50.357 "v\000\000\000" # Uses: 0 00:07:50.357 ###### End of recommended dictionary. 
###### 00:07:50.357 Done 71 runs in 2 second(s) 00:07:50.357 05:33:01 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:50.357 05:33:01 -- ../common.sh@72 -- # (( i++ )) 00:07:50.357 05:33:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.357 05:33:01 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:50.357 05:33:01 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:50.357 05:33:01 -- nvmf/run.sh@24 -- # local timen=1 00:07:50.357 05:33:01 -- nvmf/run.sh@25 -- # local core=0x1 00:07:50.357 05:33:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:50.357 05:33:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:50.357 05:33:01 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:50.357 05:33:01 -- nvmf/run.sh@29 -- # port=4406 00:07:50.357 05:33:01 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:50.357 05:33:01 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:50.357 05:33:01 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:50.357 05:33:01 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:50.357 [2024-11-29 05:33:01.648033] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:50.357 [2024-11-29 05:33:01.648119] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2215782 ] 00:07:50.616 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.616 [2024-11-29 05:33:01.902942] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.874 [2024-11-29 05:33:01.931920] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:50.874 [2024-11-29 05:33:01.932040] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.874 [2024-11-29 05:33:01.983264] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.874 [2024-11-29 05:33:01.999635] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:50.874 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.874 INFO: Seed: 1097311295 00:07:50.874 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:50.874 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:50.874 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:50.874 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.874 #2 INITED exec/s: 0 rss: 59Mb 00:07:50.874 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:50.874 This may also happen if the target rejected all inputs we tried so far 00:07:50.874 [2024-11-29 05:33:02.066735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:50.874 [2024-11-29 05:33:02.066773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.874 [2024-11-29 05:33:02.066844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:50.874 [2024-11-29 05:33:02.066858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.874 [2024-11-29 05:33:02.066927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000640a cdw11:00000000 00:07:50.874 [2024-11-29 05:33:02.066941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.132 NEW_FUNC[1/669]: 0x45bec8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:51.132 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:51.132 #3 NEW cov: 11533 ft: 11534 corp: 2/7b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:51.132 [2024-11-29 05:33:02.397180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.132 [2024-11-29 05:33:02.397232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.132 [2024-11-29 05:33:02.397369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.132 [2024-11-29 05:33:02.397394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.132 [2024-11-29 05:33:02.397520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006403 cdw11:00000000 00:07:51.132 [2024-11-29 05:33:02.397545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.133 #4 NEW cov: 11646 ft: 12058 corp: 3/13b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 ChangeByte- 00:07:51.391 [2024-11-29 05:33:02.447092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.447122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.447241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.447259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.447374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000640a cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.447392] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.391 #5 NEW cov: 11652 ft: 12452 corp: 4/20b lim: 10 exec/s: 0 rss: 67Mb L: 7/7 MS: 1 InsertByte- 00:07:51.391 [2024-11-29 05:33:02.487236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.487266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.487386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.487403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.487513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a50a cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.487529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.391 #6 NEW cov: 11737 ft: 12695 corp: 5/26b lim: 10 exec/s: 0 rss: 67Mb L: 6/7 MS: 1 ChangeBinInt- 00:07:51.391 [2024-11-29 05:33:02.527274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.527303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.527433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.527449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.527561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a55d cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.527579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.391 #7 NEW cov: 11737 ft: 12773 corp: 6/32b lim: 10 exec/s: 0 rss: 67Mb L: 6/7 MS: 1 ChangeByte- 00:07:51.391 [2024-11-29 05:33:02.577408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.577440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.577571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.577589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.577701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006e0a cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.577717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.391 #13 NEW cov: 11737 ft: 12817 corp: 7/38b lim: 10 exec/s: 0 rss: 67Mb L: 6/7 MS: 1 ChangeByte- 00:07:51.391 
[2024-11-29 05:33:02.617578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.617612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.617721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006430 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.617738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.617854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a50a cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.617871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.391 #14 NEW cov: 11737 ft: 12927 corp: 8/44b lim: 10 exec/s: 0 rss: 67Mb L: 6/7 MS: 1 ChangeByte- 00:07:51.391 [2024-11-29 05:33:02.657663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.657692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.657816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.657833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.391 [2024-11-29 05:33:02.657951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000640a cdw11:00000000 00:07:51.391 [2024-11-29 05:33:02.657967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.391 #15 NEW cov: 11737 ft: 12948 corp: 9/50b lim: 10 exec/s: 0 rss: 67Mb L: 6/7 MS: 1 ShuffleBytes- 00:07:51.650 [2024-11-29 05:33:02.697652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a64 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.697680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.697809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.697826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.650 #16 NEW cov: 11737 ft: 13217 corp: 10/54b lim: 10 exec/s: 0 rss: 67Mb L: 4/7 MS: 1 CrossOver- 00:07:51.650 [2024-11-29 05:33:02.737940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a564 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.737968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.738087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000645d cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.738106] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.738226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.738243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.650 #17 NEW cov: 11737 ft: 13248 corp: 11/60b lim: 10 exec/s: 0 rss: 67Mb L: 6/7 MS: 1 ShuffleBytes- 00:07:51.650 [2024-11-29 05:33:02.778132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.778160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.778287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.778304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.778415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006403 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.778431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.650 #18 NEW cov: 11737 ft: 13290 corp: 12/66b lim: 10 exec/s: 0 rss: 67Mb L: 6/7 MS: 1 CopyPart- 00:07:51.650 [2024-11-29 05:33:02.818539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.818568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.818701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.818719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.818832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000606 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.818866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.818981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000606 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.818998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.819105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000a55d cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.819125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.650 #19 NEW cov: 11737 ft: 13556 corp: 13/76b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:51.650 [2024-11-29 05:33:02.858329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ 
(04) qid:0 cid:4 nsid:0 cdw10:00008564 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.858357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.858483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000645d cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.858500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.858617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.858633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.650 #20 NEW cov: 11737 ft: 13575 corp: 14/82b lim: 10 exec/s: 0 rss: 67Mb L: 6/10 MS: 1 ChangeBit- 00:07:51.650 [2024-11-29 05:33:02.908900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.908927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.909047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.909062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.909187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a500 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.909205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.909321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.909338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.650 [2024-11-29 05:33:02.909448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000005d cdw11:00000000 00:07:51.650 [2024-11-29 05:33:02.909463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.651 #21 NEW cov: 11737 ft: 13648 corp: 15/92b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:51.651 [2024-11-29 05:33:02.948533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005164 cdw11:00000000 00:07:51.651 [2024-11-29 05:33:02.948560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.651 [2024-11-29 05:33:02.948671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.651 [2024-11-29 05:33:02.948689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.651 [2024-11-29 05:33:02.948793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO 
CQ (04) qid:0 cid:6 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.651 [2024-11-29 05:33:02.948810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.909 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.909 #22 NEW cov: 11760 ft: 13747 corp: 16/99b lim: 10 exec/s: 0 rss: 67Mb L: 7/10 MS: 1 InsertByte- 00:07:51.909 [2024-11-29 05:33:02.988710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:02.988737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.909 [2024-11-29 05:33:02.988857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:02.988874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.909 [2024-11-29 05:33:02.988983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006403 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:02.988999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.909 #23 NEW cov: 11760 ft: 13780 corp: 17/105b lim: 10 exec/s: 0 rss: 68Mb L: 6/10 MS: 1 CMP- DE: "\000\000"- 00:07:51.909 [2024-11-29 05:33:03.028990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.029016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.909 [2024-11-29 05:33:03.029132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.029150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.909 [2024-11-29 05:33:03.029255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a500 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.029271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.909 [2024-11-29 05:33:03.029374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.029390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.909 #24 NEW cov: 11760 ft: 13791 corp: 18/113b lim: 10 exec/s: 24 rss: 68Mb L: 8/10 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:51.909 [2024-11-29 05:33:03.068422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a00 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.068449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.909 #26 NEW cov: 11760 ft: 14000 corp: 19/116b lim: 10 exec/s: 26 rss: 68Mb L: 3/10 MS: 2 ChangeByte-PersAutoDict- DE: "\000\000"- 00:07:51.909 [2024-11-29 05:33:03.108594] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006c28 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.108625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.909 #29 NEW cov: 11760 ft: 14020 corp: 20/118b lim: 10 exec/s: 29 rss: 68Mb L: 2/10 MS: 3 CrossOver-ChangeBit-InsertByte- 00:07:51.909 [2024-11-29 05:33:03.149224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.149251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.909 [2024-11-29 05:33:03.149375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006664 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.149391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.909 [2024-11-29 05:33:03.149506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006e0a cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.149522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.909 #30 NEW cov: 11760 ft: 14059 corp: 21/124b lim: 10 exec/s: 30 rss: 68Mb L: 6/10 MS: 1 ChangeBit- 00:07:51.909 [2024-11-29 05:33:03.189126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.189153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.909 [2024-11-29 05:33:03.189278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:51.909 [2024-11-29 05:33:03.189293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.168 #31 NEW cov: 11760 ft: 14078 corp: 22/129b lim: 10 exec/s: 31 rss: 68Mb L: 5/10 MS: 1 EraseBytes- 00:07:52.168 [2024-11-29 05:33:03.229774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.229803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.168 [2024-11-29 05:33:03.229918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.229935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.168 [2024-11-29 05:33:03.230019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.230036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.168 [2024-11-29 05:33:03.230146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.230162] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.168 [2024-11-29 05:33:03.230278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.230294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.168 #32 NEW cov: 11760 ft: 14095 corp: 23/139b lim: 10 exec/s: 32 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:52.168 [2024-11-29 05:33:03.269508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.269536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.168 [2024-11-29 05:33:03.269660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.269675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.168 [2024-11-29 05:33:03.269789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000032a5 cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.269805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.168 #33 NEW cov: 11760 ft: 14110 corp: 24/146b lim: 10 exec/s: 33 rss: 68Mb L: 7/10 MS: 1 InsertByte- 00:07:52.168 [2024-11-29 05:33:03.309683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.309711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.168 [2024-11-29 05:33:03.309831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.309849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.168 [2024-11-29 05:33:03.309965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a564 cdw11:00000000 00:07:52.168 [2024-11-29 05:33:03.309982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.168 #34 NEW cov: 11760 ft: 14196 corp: 25/152b lim: 10 exec/s: 34 rss: 68Mb L: 6/10 MS: 1 CopyPart- 00:07:52.168 [2024-11-29 05:33:03.349815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009364 cdw11:00000000 00:07:52.169 [2024-11-29 05:33:03.349845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.169 [2024-11-29 05:33:03.349962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.169 [2024-11-29 05:33:03.349978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.169 [2024-11-29 05:33:03.350097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE 
IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000030a5 cdw11:00000000 00:07:52.169 [2024-11-29 05:33:03.350115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.169 #35 NEW cov: 11760 ft: 14229 corp: 26/159b lim: 10 exec/s: 35 rss: 68Mb L: 7/10 MS: 1 InsertByte- 00:07:52.169 [2024-11-29 05:33:03.399948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.169 [2024-11-29 05:33:03.399977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.169 [2024-11-29 05:33:03.400091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.169 [2024-11-29 05:33:03.400109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.169 [2024-11-29 05:33:03.400225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a564 cdw11:00000000 00:07:52.169 [2024-11-29 05:33:03.400241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.169 #36 NEW cov: 11760 ft: 14236 corp: 27/166b lim: 10 exec/s: 36 rss: 68Mb L: 7/10 MS: 1 InsertByte- 00:07:52.169 [2024-11-29 05:33:03.450093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009364 cdw11:00000000 00:07:52.169 [2024-11-29 05:33:03.450122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.169 [2024-11-29 05:33:03.450241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.169 [2024-11-29 05:33:03.450257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.169 [2024-11-29 05:33:03.450380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000030ff cdw11:00000000 00:07:52.169 [2024-11-29 05:33:03.450396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.428 #37 NEW cov: 11760 ft: 14254 corp: 28/173b lim: 10 exec/s: 37 rss: 68Mb L: 7/10 MS: 1 CrossOver- 00:07:52.428 [2024-11-29 05:33:03.490099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.490125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.490241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000064a5 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.490257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.428 #38 NEW cov: 11760 ft: 14274 corp: 29/178b lim: 10 exec/s: 38 rss: 68Mb L: 5/10 MS: 1 EraseBytes- 00:07:52.428 [2024-11-29 05:33:03.540601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.540629] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.540763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006466 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.540780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.540890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a500 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.540906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.541021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.541037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.428 #39 NEW cov: 11760 ft: 14325 corp: 30/186b lim: 10 exec/s: 39 rss: 68Mb L: 8/10 MS: 1 ChangeBinInt- 00:07:52.428 [2024-11-29 05:33:03.580949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.580976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.581095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.581113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.581234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.581251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.581369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.581385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.581503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000a50a cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.581520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.428 #40 NEW cov: 11760 ft: 14353 corp: 31/196b lim: 10 exec/s: 40 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:52.428 [2024-11-29 05:33:03.620207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.620234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.428 #43 NEW cov: 11760 ft: 14367 corp: 32/199b lim: 10 exec/s: 43 rss: 68Mb L: 3/10 MS: 3 EraseBytes-ChangeByte-PersAutoDict- DE: "\000\000"- 00:07:52.428 [2024-11-29 05:33:03.671213] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.671243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.671360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.671382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.671496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000606 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.671513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.671625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000606 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.671641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.671760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000a521 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.671778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.428 #44 NEW cov: 11760 ft: 14372 corp: 33/209b lim: 10 exec/s: 44 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:07:52.428 [2024-11-29 05:33:03.710971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.711000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.711110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.711128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.428 [2024-11-29 05:33:03.711239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.428 [2024-11-29 05:33:03.711257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.687 #45 NEW cov: 11760 ft: 14399 corp: 34/216b lim: 10 exec/s: 45 rss: 68Mb L: 7/10 MS: 1 CopyPart- 00:07:52.687 [2024-11-29 05:33:03.751192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.751220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.687 [2024-11-29 05:33:03.751347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.751364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.687 [2024-11-29 05:33:03.751479] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006400 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.751495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.687 [2024-11-29 05:33:03.751619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.751637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.687 #46 NEW cov: 11760 ft: 14431 corp: 35/224b lim: 10 exec/s: 46 rss: 68Mb L: 8/10 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:52.687 [2024-11-29 05:33:03.800868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005164 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.800898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.687 [2024-11-29 05:33:03.801015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.801035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.687 [2024-11-29 05:33:03.801159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.801175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.687 #47 NEW cov: 11760 ft: 14454 corp: 36/231b lim: 10 exec/s: 47 rss: 69Mb L: 7/10 MS: 1 ShuffleBytes- 00:07:52.687 [2024-11-29 05:33:03.851152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006431 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.851180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.687 [2024-11-29 05:33:03.851301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000064a5 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.851319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.687 #48 NEW cov: 11760 ft: 14483 corp: 37/236b lim: 10 exec/s: 48 rss: 69Mb L: 5/10 MS: 1 ChangeByte- 00:07:52.687 [2024-11-29 05:33:03.901565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000664 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.901594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.687 [2024-11-29 05:33:03.901720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.901737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.687 [2024-11-29 05:33:03.901853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000640a cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.901872] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.687 #49 NEW cov: 11760 ft: 14492 corp: 38/242b lim: 10 exec/s: 49 rss: 69Mb L: 6/10 MS: 1 CrossOver- 00:07:52.687 [2024-11-29 05:33:03.941961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.941990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.687 [2024-11-29 05:33:03.942107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002a64 cdw11:00000000 00:07:52.687 [2024-11-29 05:33:03.942123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.688 [2024-11-29 05:33:03.942227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a500 cdw11:00000000 00:07:52.688 [2024-11-29 05:33:03.942245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.688 [2024-11-29 05:33:03.942348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.688 [2024-11-29 05:33:03.942366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.688 [2024-11-29 05:33:03.942480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000005d cdw11:00000000 00:07:52.688 [2024-11-29 05:33:03.942496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.688 #50 NEW cov: 11760 ft: 14494 corp: 39/252b lim: 10 exec/s: 50 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:52.946 [2024-11-29 05:33:03.991719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005100 cdw11:00000000 00:07:52.946 [2024-11-29 05:33:03.991750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.947 [2024-11-29 05:33:03.991870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000064 cdw11:00000000 00:07:52.947 [2024-11-29 05:33:03.991888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.947 [2024-11-29 05:33:03.992001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.947 [2024-11-29 05:33:03.992018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.947 #51 NEW cov: 11760 ft: 14505 corp: 40/259b lim: 10 exec/s: 51 rss: 69Mb L: 7/10 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:52.947 [2024-11-29 05:33:04.041620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000645b cdw11:00000000 00:07:52.947 [2024-11-29 05:33:04.041649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.947 [2024-11-29 05:33:04.041766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006464 cdw11:00000000 00:07:52.947 [2024-11-29 05:33:04.041784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.947 #52 NEW cov: 11760 ft: 14506 corp: 41/264b lim: 10 exec/s: 26 rss: 69Mb L: 5/10 MS: 1 ChangeBinInt- 00:07:52.947 #52 DONE cov: 11760 ft: 14506 corp: 41/264b lim: 10 exec/s: 26 rss: 69Mb 00:07:52.947 ###### Recommended dictionary. ###### 00:07:52.947 "\000\000" # Uses: 5 00:07:52.947 ###### End of recommended dictionary. ###### 00:07:52.947 Done 52 runs in 2 second(s) 00:07:52.947 05:33:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:52.947 05:33:04 -- ../common.sh@72 -- # (( i++ )) 00:07:52.947 05:33:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.947 05:33:04 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:52.947 05:33:04 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:52.947 05:33:04 -- nvmf/run.sh@24 -- # local timen=1 00:07:52.947 05:33:04 -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.947 05:33:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:52.947 05:33:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:52.947 05:33:04 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:52.947 05:33:04 -- nvmf/run.sh@29 -- # port=4407 00:07:52.947 05:33:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:52.947 05:33:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:52.947 05:33:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.947 05:33:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:52.947 [2024-11-29 05:33:04.231230] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:52.947 [2024-11-29 05:33:04.231320] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2216489 ] 00:07:53.205 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.206 [2024-11-29 05:33:04.488593] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.464 [2024-11-29 05:33:04.514775] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:53.464 [2024-11-29 05:33:04.514917] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.464 [2024-11-29 05:33:04.566219] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:53.464 [2024-11-29 05:33:04.582586] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:53.464 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:53.464 INFO: Seed: 3680289189 00:07:53.464 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:53.464 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:53.464 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:53.464 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.464 #2 INITED exec/s: 0 rss: 59Mb 00:07:53.464 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:53.464 This may also happen if the target rejected all inputs we tried so far 00:07:53.464 [2024-11-29 05:33:04.649363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a34 cdw11:00000000 00:07:53.464 [2024-11-29 05:33:04.649401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.722 NEW_FUNC[1/668]: 0x45c8c8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:53.722 NEW_FUNC[2/668]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.722 #5 NEW cov: 11532 ft: 11515 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 3 ShuffleBytes-CopyPart-InsertByte- 00:07:53.722 [2024-11-29 05:33:04.979619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000600a cdw11:00000000 00:07:53.722 [2024-11-29 05:33:04.979660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.722 NEW_FUNC[1/1]: 0xead8b8 in spdk_get_ticks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:296 00:07:53.722 #6 NEW cov: 11646 ft: 12010 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:53.722 [2024-11-29 05:33:05.019609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a24 cdw11:00000000 00:07:53.722 [2024-11-29 05:33:05.019639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.985 #7 NEW cov: 11652 ft: 12437 corp: 4/7b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:53.985 [2024-11-29 05:33:05.059759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.059787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.985 #8 NEW cov: 11737 ft: 12759 corp: 5/9b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:53.985 [2024-11-29 05:33:05.110102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008aef cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.110129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.985 [2024-11-29 05:33:05.110246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000efef cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.110263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:53.985 #10 NEW cov: 11737 ft: 13008 corp: 6/13b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:53.985 [2024-11-29 05:33:05.149980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007460 cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.150007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.985 #11 NEW cov: 11737 ft: 13087 corp: 7/15b lim: 10 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 ChangeByte- 00:07:53.985 [2024-11-29 05:33:05.190148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007060 cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.190181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.985 #12 NEW cov: 11737 ft: 13139 corp: 8/17b lim: 10 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 ChangeBinInt- 00:07:53.985 [2024-11-29 05:33:05.240732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006000 cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.240762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.985 [2024-11-29 05:33:05.240876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.240893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.985 [2024-11-29 05:33:05.240999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.241016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.985 #13 NEW cov: 11737 ft: 13318 corp: 9/23b lim: 10 exec/s: 0 rss: 68Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:53.985 [2024-11-29 05:33:05.280607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a2a cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.280636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.985 [2024-11-29 05:33:05.280753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000efef cdw11:00000000 00:07:53.985 [2024-11-29 05:33:05.280773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.242 #14 NEW cov: 11737 ft: 13364 corp: 10/27b lim: 10 exec/s: 0 rss: 68Mb L: 4/6 MS: 1 ChangeByte- 00:07:54.242 [2024-11-29 05:33:05.331043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006000 cdw11:00000000 00:07:54.242 [2024-11-29 05:33:05.331072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.242 [2024-11-29 05:33:05.331180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:54.242 [2024-11-29 05:33:05.331197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:54.242 [2024-11-29 05:33:05.331302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:54.242 [2024-11-29 05:33:05.331318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.242 #15 NEW cov: 11737 ft: 13444 corp: 11/33b lim: 10 exec/s: 0 rss: 68Mb L: 6/6 MS: 1 CrossOver- 00:07:54.242 [2024-11-29 05:33:05.380704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a30 cdw11:00000000 00:07:54.242 [2024-11-29 05:33:05.380732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.242 #16 NEW cov: 11737 ft: 13507 corp: 12/35b lim: 10 exec/s: 0 rss: 68Mb L: 2/6 MS: 1 ChangeBit- 00:07:54.242 [2024-11-29 05:33:05.430845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a51 cdw11:00000000 00:07:54.242 [2024-11-29 05:33:05.430876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.242 #17 NEW cov: 11737 ft: 13540 corp: 13/37b lim: 10 exec/s: 0 rss: 68Mb L: 2/6 MS: 1 ChangeByte- 00:07:54.242 [2024-11-29 05:33:05.470916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007027 cdw11:00000000 00:07:54.242 [2024-11-29 05:33:05.470945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.242 #18 NEW cov: 11737 ft: 13567 corp: 14/39b lim: 10 exec/s: 0 rss: 69Mb L: 2/6 MS: 1 ChangeByte- 00:07:54.242 [2024-11-29 05:33:05.521322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c3ef cdw11:00000000 00:07:54.242 [2024-11-29 05:33:05.521350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.242 [2024-11-29 05:33:05.521459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000efef cdw11:00000000 00:07:54.242 [2024-11-29 05:33:05.521476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.242 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.243 #19 NEW cov: 11760 ft: 13651 corp: 15/43b lim: 10 exec/s: 0 rss: 69Mb L: 4/6 MS: 1 ChangeByte- 00:07:54.500 [2024-11-29 05:33:05.561263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008ca2 cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.561291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.500 #20 NEW cov: 11760 ft: 13708 corp: 16/45b lim: 10 exec/s: 0 rss: 69Mb L: 2/6 MS: 1 ChangeBinInt- 00:07:54.500 [2024-11-29 05:33:05.601830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006000 cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.601857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.500 [2024-11-29 05:33:05.601970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000041 cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.601989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.500 [2024-11-29 05:33:05.602103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.602120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.500 #21 NEW cov: 11760 ft: 13722 corp: 17/52b lim: 10 exec/s: 0 rss: 69Mb L: 7/7 MS: 1 InsertByte- 00:07:54.500 [2024-11-29 05:33:05.641673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008aaa cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.641701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.500 [2024-11-29 05:33:05.641817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000efef cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.641835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.500 #22 NEW cov: 11760 ft: 13784 corp: 18/56b lim: 10 exec/s: 22 rss: 69Mb L: 4/7 MS: 1 ChangeBit- 00:07:54.500 [2024-11-29 05:33:05.691624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002525 cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.691654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.500 #25 NEW cov: 11760 ft: 13794 corp: 19/58b lim: 10 exec/s: 25 rss: 69Mb L: 2/7 MS: 3 EraseBytes-ChangeByte-CopyPart- 00:07:54.500 [2024-11-29 05:33:05.732039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a24 cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.732068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.500 [2024-11-29 05:33:05.732179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008ca2 cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.732195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.500 #26 NEW cov: 11760 ft: 13873 corp: 20/62b lim: 10 exec/s: 26 rss: 69Mb L: 4/7 MS: 1 CrossOver- 00:07:54.500 [2024-11-29 05:33:05.771819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006008 cdw11:00000000 00:07:54.500 [2024-11-29 05:33:05.771847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.500 #27 NEW cov: 11760 ft: 13895 corp: 21/64b lim: 10 exec/s: 27 rss: 69Mb L: 2/7 MS: 1 ChangeBit- 00:07:54.759 [2024-11-29 05:33:05.812586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002525 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.812616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.759 [2024-11-29 05:33:05.812722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006666 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.812737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.759 [2024-11-29 05:33:05.812855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006666 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.812870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.759 [2024-11-29 05:33:05.812995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006666 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.813012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.759 #28 NEW cov: 11760 ft: 14136 corp: 22/72b lim: 10 exec/s: 28 rss: 69Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:54.759 [2024-11-29 05:33:05.852726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.852753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.759 [2024-11-29 05:33:05.852862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.852879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.759 [2024-11-29 05:33:05.852990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.853006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.759 [2024-11-29 05:33:05.853061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000a251 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.853077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.759 #30 NEW cov: 11760 ft: 14159 corp: 23/80b lim: 10 exec/s: 30 rss: 69Mb L: 8/8 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:54.759 [2024-11-29 05:33:05.892696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c3ef cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.892723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.759 [2024-11-29 05:33:05.892833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000efef cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.892849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.759 [2024-11-29 05:33:05.892962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000efef cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.892977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.759 #31 NEW cov: 11760 ft: 14180 corp: 24/86b lim: 10 
exec/s: 31 rss: 69Mb L: 6/8 MS: 1 CopyPart- 00:07:54.759 [2024-11-29 05:33:05.932396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.932423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.759 #32 NEW cov: 11760 ft: 14215 corp: 25/89b lim: 10 exec/s: 32 rss: 69Mb L: 3/8 MS: 1 CrossOver- 00:07:54.759 [2024-11-29 05:33:05.972410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008ca2 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:05.972437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.759 #38 NEW cov: 11760 ft: 14224 corp: 26/91b lim: 10 exec/s: 38 rss: 69Mb L: 2/8 MS: 1 CopyPart- 00:07:54.759 [2024-11-29 05:33:06.012581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000341a cdw11:00000000 00:07:54.759 [2024-11-29 05:33:06.012610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.759 #43 NEW cov: 11760 ft: 14245 corp: 27/93b lim: 10 exec/s: 43 rss: 69Mb L: 2/8 MS: 5 EraseBytes-ChangeBit-ChangeBit-CrossOver-InsertByte- 00:07:54.759 [2024-11-29 05:33:06.052697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007a74 cdw11:00000000 00:07:54.759 [2024-11-29 05:33:06.052722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.017 #44 NEW cov: 11760 ft: 14253 corp: 28/96b lim: 10 exec/s: 44 rss: 69Mb L: 3/8 MS: 1 InsertByte- 00:07:55.017 [2024-11-29 05:33:06.092850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008aef cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.092875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.017 #45 NEW cov: 11760 ft: 14319 corp: 29/98b lim: 10 exec/s: 45 rss: 69Mb L: 2/8 MS: 1 EraseBytes- 00:07:55.017 [2024-11-29 05:33:06.133047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000808 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.133074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.017 #48 NEW cov: 11760 ft: 14327 corp: 30/100b lim: 10 exec/s: 48 rss: 69Mb L: 2/8 MS: 3 ChangeBit-ShuffleBytes-CopyPart- 00:07:55.017 [2024-11-29 05:33:06.163488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a25 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.163515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.017 [2024-11-29 05:33:06.163627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000efef cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.163655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.017 [2024-11-29 05:33:06.163760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 
nsid:0 cdw10:000025ef cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.163776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.017 #49 NEW cov: 11760 ft: 14424 corp: 31/106b lim: 10 exec/s: 49 rss: 69Mb L: 6/8 MS: 1 CrossOver- 00:07:55.017 [2024-11-29 05:33:06.204047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.204072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.017 [2024-11-29 05:33:06.204195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.204215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.017 [2024-11-29 05:33:06.204324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.204341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.017 [2024-11-29 05:33:06.204449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.204464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.017 [2024-11-29 05:33:06.204575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000a251 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.204591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.017 #50 NEW cov: 11760 ft: 14463 corp: 32/116b lim: 10 exec/s: 50 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:55.017 [2024-11-29 05:33:06.243573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007460 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.243606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.017 [2024-11-29 05:33:06.243740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a24 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.243757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.017 #51 NEW cov: 11760 ft: 14471 corp: 33/121b lim: 10 exec/s: 51 rss: 69Mb L: 5/10 MS: 1 CrossOver- 00:07:55.017 [2024-11-29 05:33:06.283855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006000 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.283882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.017 [2024-11-29 05:33:06.283996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.284014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.017 [2024-11-29 
05:33:06.284125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:55.017 [2024-11-29 05:33:06.284142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.017 #52 NEW cov: 11760 ft: 14486 corp: 34/127b lim: 10 exec/s: 52 rss: 69Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:55.275 [2024-11-29 05:33:06.324301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.324329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.275 [2024-11-29 05:33:06.324446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.324462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.275 [2024-11-29 05:33:06.324573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.324590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.275 [2024-11-29 05:33:06.324711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.324727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.275 [2024-11-29 05:33:06.324836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000a30 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.324854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.275 #53 NEW cov: 11760 ft: 14500 corp: 35/137b lim: 10 exec/s: 53 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:55.275 [2024-11-29 05:33:06.363607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000342a cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.363633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.275 #55 NEW cov: 11760 ft: 14514 corp: 36/139b lim: 10 exec/s: 55 rss: 69Mb L: 2/10 MS: 2 EraseBytes-InsertByte- 00:07:55.275 [2024-11-29 05:33:06.404268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.404295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.275 [2024-11-29 05:33:06.404404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.404420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.275 [2024-11-29 05:33:06.404545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.404562] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.275 #56 NEW cov: 11760 ft: 14524 corp: 37/145b lim: 10 exec/s: 56 rss: 70Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:55.275 [2024-11-29 05:33:06.444258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007460 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.444287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.275 [2024-11-29 05:33:06.444401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000efef cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.444417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.275 #57 NEW cov: 11760 ft: 14551 corp: 38/149b lim: 10 exec/s: 57 rss: 70Mb L: 4/10 MS: 1 CrossOver- 00:07:55.275 [2024-11-29 05:33:06.484082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000818 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.484109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.275 #58 NEW cov: 11760 ft: 14554 corp: 39/151b lim: 10 exec/s: 58 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:55.275 [2024-11-29 05:33:06.524210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a11 cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.524236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.275 #59 NEW cov: 11760 ft: 14556 corp: 40/153b lim: 10 exec/s: 59 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:55.275 [2024-11-29 05:33:06.564317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ef8a cdw11:00000000 00:07:55.275 [2024-11-29 05:33:06.564343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.534 #60 NEW cov: 11760 ft: 14559 corp: 41/155b lim: 10 exec/s: 60 rss: 70Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:55.534 [2024-11-29 05:33:06.605259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:55.534 [2024-11-29 05:33:06.605289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.534 [2024-11-29 05:33:06.605402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:55.534 [2024-11-29 05:33:06.605418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.534 [2024-11-29 05:33:06.605533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a2a2 cdw11:00000000 00:07:55.534 [2024-11-29 05:33:06.605547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.534 [2024-11-29 05:33:06.605661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b7a2 cdw11:00000000 00:07:55.534 [2024-11-29 05:33:06.605678] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.534 [2024-11-29 05:33:06.605803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000a251 cdw11:00000000 00:07:55.534 [2024-11-29 05:33:06.605818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.534 #61 NEW cov: 11760 ft: 14566 corp: 42/165b lim: 10 exec/s: 30 rss: 70Mb L: 10/10 MS: 1 ChangeByte- 00:07:55.534 #61 DONE cov: 11760 ft: 14566 corp: 42/165b lim: 10 exec/s: 30 rss: 70Mb 00:07:55.534 Done 61 runs in 2 second(s) 00:07:55.534 05:33:06 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:55.534 05:33:06 -- ../common.sh@72 -- # (( i++ )) 00:07:55.534 05:33:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.534 05:33:06 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:55.534 05:33:06 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:55.534 05:33:06 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.534 05:33:06 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.534 05:33:06 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:55.534 05:33:06 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:55.534 05:33:06 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:55.534 05:33:06 -- nvmf/run.sh@29 -- # port=4408 00:07:55.534 05:33:06 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:55.534 05:33:06 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:55.534 05:33:06 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.534 05:33:06 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:55.534 [2024-11-29 05:33:06.791766] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:55.534 [2024-11-29 05:33:06.791857] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2217067 ] 00:07:55.534 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.792 [2024-11-29 05:33:07.046606] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.792 [2024-11-29 05:33:07.075776] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.792 [2024-11-29 05:33:07.075918] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.051 [2024-11-29 05:33:07.127664] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.051 [2024-11-29 05:33:07.144024] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:56.051 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:56.051 INFO: Seed: 1947323797 00:07:56.051 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:56.051 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:56.051 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:56.051 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.051 [2024-11-29 05:33:07.189168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.189197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.051 #2 INITED cov: 11555 ft: 11556 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:56.051 [2024-11-29 05:33:07.219568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.219594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.219655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.219669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.219723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.219736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.219788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.219801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.051 #3 NEW cov: 11674 ft: 12902 corp: 2/5b lim: 5 exec/s: 0 rss: 65Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:56.051 [2024-11-29 05:33:07.269738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.269765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.269833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.269847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.269900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.269914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.269966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.269980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.051 #4 NEW cov: 11680 ft: 13011 corp: 3/9b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:56.051 [2024-11-29 05:33:07.309348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.309373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.051 #5 NEW cov: 11765 ft: 13279 corp: 4/10b lim: 5 exec/s: 0 rss: 66Mb L: 1/4 MS: 1 ChangeBinInt- 00:07:56.051 [2024-11-29 05:33:07.350068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.350094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.350150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.350164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.350219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.350233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.350287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.350301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.051 [2024-11-29 05:33:07.350354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.051 [2024-11-29 05:33:07.350368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.311 #6 NEW cov: 11765 ft: 13446 corp: 5/15b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertByte- 00:07:56.311 [2024-11-29 05:33:07.389567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.389593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.311 #7 NEW cov: 11765 ft: 13622 corp: 6/16b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ChangeByte- 00:07:56.311 [2024-11-29 
05:33:07.430186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.430212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.430265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.430280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.430332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.430345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.430398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.430412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.311 #8 NEW cov: 11765 ft: 13699 corp: 7/20b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 CrossOver- 00:07:56.311 [2024-11-29 05:33:07.480262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.480291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.480360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.480374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.480427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.480440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.480495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.480508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.311 #9 NEW cov: 11765 ft: 13727 corp: 8/24b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 ShuffleBytes- 00:07:56.311 [2024-11-29 05:33:07.520561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.520586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.520644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.520657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.520711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.520724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.520776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.520789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.520841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.520854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.311 #10 NEW cov: 11765 ft: 13772 corp: 9/29b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 CrossOver- 00:07:56.311 [2024-11-29 05:33:07.560653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.560679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.560723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.560737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.560790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.560822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.560877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.560890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.311 [2024-11-29 05:33:07.560942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.560956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
00:07:56.311 #11 NEW cov: 11765 ft: 13815 corp: 10/34b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertByte- 00:07:56.311 [2024-11-29 05:33:07.600151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.311 [2024-11-29 05:33:07.600176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.570 #12 NEW cov: 11765 ft: 13884 corp: 11/35b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeByte- 00:07:56.570 [2024-11-29 05:33:07.640883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.640908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.570 [2024-11-29 05:33:07.640963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.640977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.570 [2024-11-29 05:33:07.641032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.641046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.570 [2024-11-29 05:33:07.641102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.641116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.570 [2024-11-29 05:33:07.641172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.641185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.570 #13 NEW cov: 11765 ft: 13974 corp: 12/40b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CopyPart- 00:07:56.570 [2024-11-29 05:33:07.680834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.680859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.570 [2024-11-29 05:33:07.680910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.680924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.570 [2024-11-29 05:33:07.680975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:56.570 [2024-11-29 05:33:07.680991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.570 [2024-11-29 05:33:07.681043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.681057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.570 #14 NEW cov: 11765 ft: 13988 corp: 13/44b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 CrossOver- 00:07:56.570 [2024-11-29 05:33:07.720516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.720541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.570 #15 NEW cov: 11765 ft: 14028 corp: 14/45b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:56.570 [2024-11-29 05:33:07.761115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.761140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.570 [2024-11-29 05:33:07.761193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.570 [2024-11-29 05:33:07.761207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.571 [2024-11-29 05:33:07.761259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.571 [2024-11-29 05:33:07.761273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.571 [2024-11-29 05:33:07.761325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.571 [2024-11-29 05:33:07.761338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.571 #16 NEW cov: 11765 ft: 14042 corp: 15/49b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 CopyPart- 00:07:56.571 [2024-11-29 05:33:07.800926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.571 [2024-11-29 05:33:07.800951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.571 [2024-11-29 05:33:07.801022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.571 [2024-11-29 05:33:07.801036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.571 #17 NEW cov: 11765 ft: 
14271 corp: 16/51b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 InsertByte- 00:07:56.571 [2024-11-29 05:33:07.841498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.571 [2024-11-29 05:33:07.841523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.571 [2024-11-29 05:33:07.841577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.571 [2024-11-29 05:33:07.841591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.571 [2024-11-29 05:33:07.841651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.571 [2024-11-29 05:33:07.841681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.571 [2024-11-29 05:33:07.841738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.571 [2024-11-29 05:33:07.841751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.571 [2024-11-29 05:33:07.841805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.571 [2024-11-29 05:33:07.841819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.571 #18 NEW cov: 11765 ft: 14274 corp: 17/56b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeByte- 00:07:56.829 [2024-11-29 05:33:07.881596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.881631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.881689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.881703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.881761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.881774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.881831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.881844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.881900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.881913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.829 #19 NEW cov: 11765 ft: 14319 corp: 18/61b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertByte- 00:07:56.829 [2024-11-29 05:33:07.921709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.921734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.921804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.921818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.921872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.921885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.921940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.921954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.922007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.922020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.829 #20 NEW cov: 11765 ft: 14338 corp: 19/66b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertByte- 00:07:56.829 [2024-11-29 05:33:07.961803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.961829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.961900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.961914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.961968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.961982] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.962036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.962049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:07.962103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:07.962116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.829 #21 NEW cov: 11765 ft: 14366 corp: 20/71b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CMP- DE: "\000\016"- 00:07:56.829 [2024-11-29 05:33:08.001821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:08.001847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:08.001901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:08.001914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:08.001968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:08.001982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.829 [2024-11-29 05:33:08.002034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:08.002047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.829 #22 NEW cov: 11765 ft: 14385 corp: 21/75b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 ChangeByte- 00:07:56.829 [2024-11-29 05:33:08.042072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.829 [2024-11-29 05:33:08.042097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.830 [2024-11-29 05:33:08.042165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.830 [2024-11-29 05:33:08.042178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.830 [2024-11-29 05:33:08.042234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:56.830 [2024-11-29 05:33:08.042248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.830 [2024-11-29 05:33:08.042300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.830 [2024-11-29 05:33:08.042312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.830 [2024-11-29 05:33:08.042364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.830 [2024-11-29 05:33:08.042377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.830 #23 NEW cov: 11765 ft: 14409 corp: 22/80b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeByte- 00:07:56.830 [2024-11-29 05:33:08.082069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.830 [2024-11-29 05:33:08.082093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.830 [2024-11-29 05:33:08.082164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.830 [2024-11-29 05:33:08.082178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.830 [2024-11-29 05:33:08.082231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.830 [2024-11-29 05:33:08.082244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.830 [2024-11-29 05:33:08.082296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.830 [2024-11-29 05:33:08.082309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.087 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.087 #24 NEW cov: 11788 ft: 14501 corp: 23/84b lim: 5 exec/s: 24 rss: 68Mb L: 4/5 MS: 1 CrossOver- 00:07:57.087 [2024-11-29 05:33:08.372464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.087 [2024-11-29 05:33:08.372504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.344 #25 NEW cov: 11788 ft: 14606 corp: 24/85b lim: 5 exec/s: 25 rss: 68Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:57.345 [2024-11-29 05:33:08.412745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.412776] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.412848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.412862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.412918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.412931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.412986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.412999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.345 #26 NEW cov: 11788 ft: 14616 corp: 25/89b lim: 5 exec/s: 26 rss: 68Mb L: 4/5 MS: 1 ChangeByte- 00:07:57.345 [2024-11-29 05:33:08.443054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.443080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.443150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.443164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.443218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.443231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.443282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.443295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.443349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.443362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.345 #27 NEW cov: 11788 ft: 14638 corp: 26/94b lim: 5 exec/s: 27 rss: 68Mb L: 5/5 MS: 1 CopyPart- 00:07:57.345 [2024-11-29 05:33:08.482874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:57.345 [2024-11-29 05:33:08.482900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.482970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.482985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.483042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.483059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.345 #28 NEW cov: 11788 ft: 14799 corp: 27/97b lim: 5 exec/s: 28 rss: 68Mb L: 3/5 MS: 1 EraseBytes- 00:07:57.345 [2024-11-29 05:33:08.522893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.522919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.522991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.523005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.345 #29 NEW cov: 11788 ft: 14816 corp: 28/99b lim: 5 exec/s: 29 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:07:57.345 [2024-11-29 05:33:08.563311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.563336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.563406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.563421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.563473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.563486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.563539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.563553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.345 #30 NEW cov: 11788 ft: 14891 corp: 29/103b lim: 5 exec/s: 30 rss: 68Mb L: 4/5 MS: 1 ChangeBit- 00:07:57.345 [2024-11-29 
05:33:08.603257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.603283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.603340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.603354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.603408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.603421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.345 #31 NEW cov: 11788 ft: 14898 corp: 30/106b lim: 5 exec/s: 31 rss: 68Mb L: 3/5 MS: 1 PersAutoDict- DE: "\000\016"- 00:07:57.345 [2024-11-29 05:33:08.643278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.643305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.345 [2024-11-29 05:33:08.643364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.345 [2024-11-29 05:33:08.643378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.603 #32 NEW cov: 11788 ft: 14903 corp: 31/108b lim: 5 exec/s: 32 rss: 68Mb L: 2/5 MS: 1 EraseBytes- 00:07:57.603 [2024-11-29 05:33:08.683544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.683570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.683633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.683647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.683700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.683714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.603 #33 NEW cov: 11788 ft: 14948 corp: 32/111b lim: 5 exec/s: 33 rss: 69Mb L: 3/5 MS: 1 EraseBytes- 00:07:57.603 [2024-11-29 05:33:08.723300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:57.603 [2024-11-29 05:33:08.723324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.603 #34 NEW cov: 11788 ft: 15006 corp: 33/112b lim: 5 exec/s: 34 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:57.603 [2024-11-29 05:33:08.753798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.753826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.753879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.753892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.753943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.753957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.753999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.754013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.603 #35 NEW cov: 11788 ft: 15054 corp: 34/116b lim: 5 exec/s: 35 rss: 69Mb L: 4/5 MS: 1 ChangeBit- 00:07:57.603 [2024-11-29 05:33:08.794037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.794063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.794125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.794142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.794197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.794211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.794265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.794279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.603 #36 NEW cov: 11788 ft: 15058 corp: 35/120b lim: 5 exec/s: 36 rss: 69Mb L: 4/5 MS: 1 ChangeBit- 00:07:57.603 [2024-11-29 
05:33:08.834198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.834223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.834292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.834306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.603 [2024-11-29 05:33:08.834357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.603 [2024-11-29 05:33:08.834371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.604 [2024-11-29 05:33:08.834400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.604 [2024-11-29 05:33:08.834413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.604 [2024-11-29 05:33:08.834466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.604 [2024-11-29 05:33:08.834479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.604 #37 NEW cov: 11788 ft: 15070 corp: 36/125b lim: 5 exec/s: 37 rss: 69Mb L: 5/5 MS: 1 CrossOver- 00:07:57.604 [2024-11-29 05:33:08.874174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.604 [2024-11-29 05:33:08.874199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.604 [2024-11-29 05:33:08.874268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.604 [2024-11-29 05:33:08.874281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.604 [2024-11-29 05:33:08.874331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.604 [2024-11-29 05:33:08.874344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.604 [2024-11-29 05:33:08.874394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.604 [2024-11-29 05:33:08.874410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.604 #38 NEW cov: 11788 ft: 15080 corp: 37/129b lim: 5 
exec/s: 38 rss: 69Mb L: 4/5 MS: 1 ChangeByte- 00:07:57.862 [2024-11-29 05:33:08.913842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:08.913868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.862 #39 NEW cov: 11788 ft: 15158 corp: 38/130b lim: 5 exec/s: 39 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:57.862 [2024-11-29 05:33:08.954400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:08.954426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:08.954479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:08.954493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:08.954544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:08.954556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:08.954613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:08.954626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.862 #40 NEW cov: 11788 ft: 15164 corp: 39/134b lim: 5 exec/s: 40 rss: 69Mb L: 4/5 MS: 1 CMP- DE: "\000\010"- 00:07:57.862 [2024-11-29 05:33:08.994534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:08.994560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:08.994632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:08.994647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:08.994699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:08.994712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:08.994762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:08.994776] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.862 #41 NEW cov: 11788 ft: 15182 corp: 40/138b lim: 5 exec/s: 41 rss: 69Mb L: 4/5 MS: 1 ShuffleBytes- 00:07:57.862 [2024-11-29 05:33:09.034732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.034757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:09.034828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.034842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:09.034894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.034907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:09.034959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.034972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:09.035024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.035038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.862 #42 NEW cov: 11788 ft: 15195 corp: 41/143b lim: 5 exec/s: 42 rss: 69Mb L: 5/5 MS: 1 ChangeByte- 00:07:57.862 [2024-11-29 05:33:09.074459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.074484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:09.074550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.074564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.862 #43 NEW cov: 11788 ft: 15204 corp: 42/145b lim: 5 exec/s: 43 rss: 69Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:57.862 [2024-11-29 05:33:09.114550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.114574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:09.114651] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.114666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.862 #44 NEW cov: 11788 ft: 15211 corp: 43/147b lim: 5 exec/s: 44 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:07:57.862 [2024-11-29 05:33:09.154855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.154880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:09.154951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.154965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.862 [2024-11-29 05:33:09.155020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.862 [2024-11-29 05:33:09.155034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.120 #45 NEW cov: 11788 ft: 15229 corp: 44/150b lim: 5 exec/s: 22 rss: 69Mb L: 3/5 MS: 1 EraseBytes- 00:07:58.120 #45 DONE cov: 11788 ft: 15229 corp: 44/150b lim: 5 exec/s: 22 rss: 69Mb 00:07:58.120 ###### Recommended dictionary. ###### 00:07:58.121 "\000\016" # Uses: 1 00:07:58.121 "\000\010" # Uses: 0 00:07:58.121 ###### End of recommended dictionary. 
###### 00:07:58.121 Done 45 runs in 2 second(s) 00:07:58.121 05:33:09 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:58.121 05:33:09 -- ../common.sh@72 -- # (( i++ )) 00:07:58.121 05:33:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.121 05:33:09 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:58.121 05:33:09 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:58.121 05:33:09 -- nvmf/run.sh@24 -- # local timen=1 00:07:58.121 05:33:09 -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.121 05:33:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:58.121 05:33:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:58.121 05:33:09 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:58.121 05:33:09 -- nvmf/run.sh@29 -- # port=4409 00:07:58.121 05:33:09 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:58.121 05:33:09 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:58.121 05:33:09 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.121 05:33:09 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:58.121 [2024-11-29 05:33:09.339811] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:58.121 [2024-11-29 05:33:09.339901] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2217598 ] 00:07:58.121 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.378 [2024-11-29 05:33:09.590807] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.378 [2024-11-29 05:33:09.619624] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:58.378 [2024-11-29 05:33:09.619744] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.378 [2024-11-29 05:33:09.670981] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.635 [2024-11-29 05:33:09.687348] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:58.635 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:58.635 INFO: Seed: 194352940 00:07:58.635 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:58.635 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:58.635 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:58.635 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.635 [2024-11-29 05:33:09.736043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.635 [2024-11-29 05:33:09.736070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.635 #2 INITED cov: 11561 ft: 11562 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:58.635 [2024-11-29 05:33:09.766213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.635 [2024-11-29 05:33:09.766238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.635 [2024-11-29 05:33:09.766297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.635 [2024-11-29 05:33:09.766311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.635 #3 NEW cov: 11674 ft: 12788 corp: 2/3b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:58.635 [2024-11-29 05:33:09.816361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.635 [2024-11-29 05:33:09.816386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.635 [2024-11-29 05:33:09.816442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.635 [2024-11-29 05:33:09.816455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.635 #4 NEW cov: 11680 ft: 12997 corp: 3/5b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ChangeBit- 00:07:58.635 [2024-11-29 05:33:09.856463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.635 [2024-11-29 05:33:09.856489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.635 [2024-11-29 05:33:09.856562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.636 [2024-11-29 05:33:09.856578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.636 #5 NEW cov: 11765 ft: 13311 corp: 4/7b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CrossOver- 00:07:58.636 [2024-11-29 05:33:09.896620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT 
(0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.636 [2024-11-29 05:33:09.896647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.636 [2024-11-29 05:33:09.896707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.636 [2024-11-29 05:33:09.896721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.636 #6 NEW cov: 11765 ft: 13397 corp: 5/9b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:58.636 [2024-11-29 05:33:09.936775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.636 [2024-11-29 05:33:09.936801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.636 [2024-11-29 05:33:09.936858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.636 [2024-11-29 05:33:09.936872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.893 #7 NEW cov: 11765 ft: 13418 corp: 6/11b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:07:58.893 [2024-11-29 05:33:09.976835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:09.976861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.893 [2024-11-29 05:33:09.976916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:09.976933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.893 #8 NEW cov: 11765 ft: 13539 corp: 7/13b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ChangeBit- 00:07:58.893 [2024-11-29 05:33:10.016851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:10.016877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.893 #9 NEW cov: 11765 ft: 13593 corp: 8/14b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:58.893 [2024-11-29 05:33:10.057076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:10.057103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.893 [2024-11-29 05:33:10.057162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 
[2024-11-29 05:33:10.057176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.893 #10 NEW cov: 11765 ft: 13650 corp: 9/16b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ChangeBit- 00:07:58.893 [2024-11-29 05:33:10.097246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:10.097275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.893 [2024-11-29 05:33:10.097332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:10.097347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.893 #11 NEW cov: 11765 ft: 13754 corp: 10/18b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ChangeBinInt- 00:07:58.893 [2024-11-29 05:33:10.137553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:10.137580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.893 [2024-11-29 05:33:10.137639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:10.137654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.893 [2024-11-29 05:33:10.137707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:10.137722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.893 #12 NEW cov: 11765 ft: 13966 corp: 11/21b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 InsertByte- 00:07:58.893 [2024-11-29 05:33:10.187521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:10.187549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.893 [2024-11-29 05:33:10.187610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.893 [2024-11-29 05:33:10.187627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.150 #13 NEW cov: 11765 ft: 13974 corp: 12/23b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 ChangeByte- 00:07:59.150 [2024-11-29 05:33:10.227402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.227429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.150 #14 NEW cov: 11765 ft: 14008 corp: 13/24b lim: 5 exec/s: 0 rss: 66Mb L: 1/3 MS: 1 ChangeBit- 00:07:59.150 [2024-11-29 05:33:10.267739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.267765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.150 [2024-11-29 05:33:10.267823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.267836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.150 #15 NEW cov: 11765 ft: 14074 corp: 14/26b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 ChangeBit- 00:07:59.150 [2024-11-29 05:33:10.307805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.307830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.150 [2024-11-29 05:33:10.307901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.307915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.150 #16 NEW cov: 11765 ft: 14103 corp: 15/28b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 ChangeBit- 00:07:59.150 [2024-11-29 05:33:10.348056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.348082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.150 [2024-11-29 05:33:10.348157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.348172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.150 [2024-11-29 05:33:10.348229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.348243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.150 #17 NEW cov: 11765 ft: 14108 corp: 16/31b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 InsertByte- 00:07:59.150 [2024-11-29 05:33:10.387898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.387922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.150 #18 NEW cov: 11765 ft: 14115 corp: 17/32b 
lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 CopyPart- 00:07:59.150 [2024-11-29 05:33:10.428445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.428474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.150 [2024-11-29 05:33:10.428529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.428543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.150 [2024-11-29 05:33:10.428600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.428614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.150 [2024-11-29 05:33:10.428671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.150 [2024-11-29 05:33:10.428685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.408 #19 NEW cov: 11765 ft: 14380 corp: 18/36b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertByte- 00:07:59.408 [2024-11-29 05:33:10.478294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.408 [2024-11-29 05:33:10.478319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.409 [2024-11-29 05:33:10.478388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.409 [2024-11-29 05:33:10.478402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.409 #20 NEW cov: 11765 ft: 14391 corp: 19/38b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 CrossOver- 00:07:59.409 [2024-11-29 05:33:10.518224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.409 [2024-11-29 05:33:10.518250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.409 #21 NEW cov: 11765 ft: 14411 corp: 20/39b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeByte- 00:07:59.409 [2024-11-29 05:33:10.558532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.409 [2024-11-29 05:33:10.558559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.409 [2024-11-29 05:33:10.558618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.409 [2024-11-29 05:33:10.558632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.409 #22 NEW cov: 11765 ft: 14502 corp: 21/41b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 ChangeByte- 00:07:59.409 [2024-11-29 05:33:10.598682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.409 [2024-11-29 05:33:10.598707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.409 [2024-11-29 05:33:10.598764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.409 [2024-11-29 05:33:10.598778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.667 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:59.667 #23 NEW cov: 11788 ft: 14545 corp: 22/43b lim: 5 exec/s: 23 rss: 68Mb L: 2/4 MS: 1 ChangeByte- 00:07:59.667 [2024-11-29 05:33:10.889767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.667 [2024-11-29 05:33:10.889807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.667 [2024-11-29 05:33:10.889868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.667 [2024-11-29 05:33:10.889886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.667 [2024-11-29 05:33:10.889945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.667 [2024-11-29 05:33:10.889963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.667 [2024-11-29 05:33:10.890022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.667 [2024-11-29 05:33:10.890039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.667 #24 NEW cov: 11788 ft: 14572 corp: 23/47b lim: 5 exec/s: 24 rss: 68Mb L: 4/4 MS: 1 InsertByte- 00:07:59.667 [2024-11-29 05:33:10.939651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.667 [2024-11-29 05:33:10.939677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.667 [2024-11-29 05:33:10.939733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.667 
[2024-11-29 05:33:10.939748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.667 [2024-11-29 05:33:10.939802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.667 [2024-11-29 05:33:10.939815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.667 #25 NEW cov: 11788 ft: 14597 corp: 24/50b lim: 5 exec/s: 25 rss: 68Mb L: 3/4 MS: 1 CopyPart- 00:07:59.926 [2024-11-29 05:33:10.979733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.926 [2024-11-29 05:33:10.979758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.926 [2024-11-29 05:33:10.979829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.926 [2024-11-29 05:33:10.979843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.926 [2024-11-29 05:33:10.979899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.926 [2024-11-29 05:33:10.979913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.926 #26 NEW cov: 11788 ft: 14624 corp: 25/53b lim: 5 exec/s: 26 rss: 68Mb L: 3/4 MS: 1 InsertByte- 00:07:59.927 [2024-11-29 05:33:11.019831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.019859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.927 [2024-11-29 05:33:11.019912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.019926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.927 [2024-11-29 05:33:11.019979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.019993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.927 #27 NEW cov: 11788 ft: 14675 corp: 26/56b lim: 5 exec/s: 27 rss: 68Mb L: 3/4 MS: 1 CrossOver- 00:07:59.927 [2024-11-29 05:33:11.059810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.059835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.927 [2024-11-29 05:33:11.059889] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.059918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.927 #28 NEW cov: 11788 ft: 14742 corp: 27/58b lim: 5 exec/s: 28 rss: 68Mb L: 2/4 MS: 1 ChangeBit- 00:07:59.927 [2024-11-29 05:33:11.099919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.099945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.927 [2024-11-29 05:33:11.099997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.100011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.927 #29 NEW cov: 11788 ft: 14778 corp: 28/60b lim: 5 exec/s: 29 rss: 68Mb L: 2/4 MS: 1 ChangeByte- 00:07:59.927 [2024-11-29 05:33:11.140189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.140213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.927 [2024-11-29 05:33:11.140269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.140283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.927 [2024-11-29 05:33:11.140335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.140365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.927 #30 NEW cov: 11788 ft: 14788 corp: 29/63b lim: 5 exec/s: 30 rss: 68Mb L: 3/4 MS: 1 InsertByte- 00:07:59.927 [2024-11-29 05:33:11.179965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.179989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.927 #31 NEW cov: 11788 ft: 14803 corp: 30/64b lim: 5 exec/s: 31 rss: 68Mb L: 1/4 MS: 1 EraseBytes- 00:07:59.927 [2024-11-29 05:33:11.220270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.220297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.927 [2024-11-29 05:33:11.220352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.927 [2024-11-29 05:33:11.220365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.186 #32 NEW cov: 11788 ft: 14819 corp: 31/66b lim: 5 exec/s: 32 rss: 69Mb L: 2/4 MS: 1 ChangeBit- 00:08:00.186 [2024-11-29 05:33:11.260520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.260545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.186 [2024-11-29 05:33:11.260617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.260631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.186 [2024-11-29 05:33:11.260683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.260697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.186 #33 NEW cov: 11788 ft: 14833 corp: 32/69b lim: 5 exec/s: 33 rss: 69Mb L: 3/4 MS: 1 InsertByte- 00:08:00.186 [2024-11-29 05:33:11.300647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.300673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.186 [2024-11-29 05:33:11.300729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.300743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.186 [2024-11-29 05:33:11.300797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.300811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.186 #34 NEW cov: 11788 ft: 14861 corp: 33/72b lim: 5 exec/s: 34 rss: 69Mb L: 3/4 MS: 1 CopyPart- 00:08:00.186 [2024-11-29 05:33:11.340799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.340826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.186 [2024-11-29 05:33:11.340880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.340893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:00.186 [2024-11-29 05:33:11.340946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.340963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.186 #35 NEW cov: 11788 ft: 14871 corp: 34/75b lim: 5 exec/s: 35 rss: 69Mb L: 3/4 MS: 1 ChangeBit- 00:08:00.186 [2024-11-29 05:33:11.381032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.381059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.186 [2024-11-29 05:33:11.381114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.381127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.186 [2024-11-29 05:33:11.381181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.381195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.186 [2024-11-29 05:33:11.381248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.186 [2024-11-29 05:33:11.381261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.186 #36 NEW cov: 11788 ft: 14882 corp: 35/79b lim: 5 exec/s: 36 rss: 69Mb L: 4/4 MS: 1 ChangeBit- 00:08:00.187 [2024-11-29 05:33:11.431028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.187 [2024-11-29 05:33:11.431054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.187 [2024-11-29 05:33:11.431126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.187 [2024-11-29 05:33:11.431141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.187 [2024-11-29 05:33:11.431198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.187 [2024-11-29 05:33:11.431212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.187 #37 NEW cov: 11788 ft: 14898 corp: 36/82b lim: 5 exec/s: 37 rss: 69Mb L: 3/4 MS: 1 ChangeBit- 00:08:00.187 [2024-11-29 05:33:11.470988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:00.187 [2024-11-29 05:33:11.471013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.187 [2024-11-29 05:33:11.471068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.187 [2024-11-29 05:33:11.471082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.446 #38 NEW cov: 11788 ft: 14907 corp: 37/84b lim: 5 exec/s: 38 rss: 69Mb L: 2/4 MS: 1 ChangeBit- 00:08:00.446 [2024-11-29 05:33:11.511069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.446 [2024-11-29 05:33:11.511095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.446 [2024-11-29 05:33:11.511153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.446 [2024-11-29 05:33:11.511165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.446 #39 NEW cov: 11788 ft: 14924 corp: 38/86b lim: 5 exec/s: 39 rss: 69Mb L: 2/4 MS: 1 ChangeByte- 00:08:00.446 [2024-11-29 05:33:11.551339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.446 [2024-11-29 05:33:11.551364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.446 [2024-11-29 05:33:11.551437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.446 [2024-11-29 05:33:11.551451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.446 [2024-11-29 05:33:11.551506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.446 [2024-11-29 05:33:11.551520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.446 #40 NEW cov: 11788 ft: 14936 corp: 39/89b lim: 5 exec/s: 40 rss: 69Mb L: 3/4 MS: 1 InsertByte- 00:08:00.446 [2024-11-29 05:33:11.591160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.446 [2024-11-29 05:33:11.591186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.446 #41 NEW cov: 11788 ft: 14968 corp: 40/90b lim: 5 exec/s: 41 rss: 69Mb L: 1/4 MS: 1 EraseBytes- 00:08:00.446 [2024-11-29 05:33:11.631871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.447 [2024-11-29 05:33:11.631897] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.447 [2024-11-29 05:33:11.631951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.447 [2024-11-29 05:33:11.631965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.447 [2024-11-29 05:33:11.632019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.447 [2024-11-29 05:33:11.632032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.447 [2024-11-29 05:33:11.632084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.447 [2024-11-29 05:33:11.632097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.447 [2024-11-29 05:33:11.632149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.447 [2024-11-29 05:33:11.632162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.447 #42 NEW cov: 11788 ft: 15120 corp: 41/95b lim: 5 exec/s: 42 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:08:00.447 [2024-11-29 05:33:11.671537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.447 [2024-11-29 05:33:11.671566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.447 [2024-11-29 05:33:11.671624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.447 [2024-11-29 05:33:11.671639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.447 #43 NEW cov: 11788 ft: 15126 corp: 42/97b lim: 5 exec/s: 43 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:08:00.447 [2024-11-29 05:33:11.711647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.447 [2024-11-29 05:33:11.711673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.447 [2024-11-29 05:33:11.711728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.447 [2024-11-29 05:33:11.711742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.447 #44 NEW cov: 11788 ft: 15134 corp: 43/99b lim: 5 exec/s: 22 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:08:00.447 #44 DONE cov: 11788 ft: 15134 corp: 43/99b lim: 5 exec/s: 22 rss: 70Mb 00:08:00.447 
Done 44 runs in 2 second(s) 00:08:00.706 05:33:11 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:08:00.706 05:33:11 -- ../common.sh@72 -- # (( i++ )) 00:08:00.706 05:33:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.706 05:33:11 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:00.706 05:33:11 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:00.706 05:33:11 -- nvmf/run.sh@24 -- # local timen=1 00:08:00.706 05:33:11 -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.706 05:33:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:00.706 05:33:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:00.706 05:33:11 -- nvmf/run.sh@29 -- # printf %02d 10 00:08:00.706 05:33:11 -- nvmf/run.sh@29 -- # port=4410 00:08:00.706 05:33:11 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:00.706 05:33:11 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:00.706 05:33:11 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.706 05:33:11 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:08:00.706 [2024-11-29 05:33:11.877993] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:00.706 [2024-11-29 05:33:11.878059] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2218121 ] 00:08:00.706 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.966 [2024-11-29 05:33:12.048391] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.966 [2024-11-29 05:33:12.067812] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:00.966 [2024-11-29 05:33:12.067936] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.966 [2024-11-29 05:33:12.119437] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.966 [2024-11-29 05:33:12.135813] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:00.966 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.966 INFO: Seed: 2645359111 00:08:00.966 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:00.966 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:00.966 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:00.966 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.966 #2 INITED exec/s: 0 rss: 60Mb 00:08:00.966 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:00.966 This may also happen if the target rejected all inputs we tried so far 00:08:00.966 [2024-11-29 05:33:12.180359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0093af89 cdw11:1ea7f3c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.966 [2024-11-29 05:33:12.180394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.225 NEW_FUNC[1/670]: 0x45e248 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:01.225 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.225 #11 NEW cov: 11584 ft: 11570 corp: 2/11b lim: 40 exec/s: 0 rss: 68Mb L: 10/10 MS: 4 ShuffleBytes-CopyPart-CrossOver-CMP- DE: "\000\223\257\211\036\247\363\302"- 00:08:01.225 [2024-11-29 05:33:12.501126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0093af89 cdw11:1ea732c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.225 [2024-11-29 05:33:12.501164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.484 #17 NEW cov: 11697 ft: 11963 corp: 3/21b lim: 40 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:08:01.484 [2024-11-29 05:33:12.571200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1ea732c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.484 [2024-11-29 05:33:12.571232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.484 #18 NEW cov: 11703 ft: 12167 corp: 4/31b lim: 40 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:08:01.484 [2024-11-29 05:33:12.631523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.484 [2024-11-29 05:33:12.631554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.484 [2024-11-29 05:33:12.631610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.484 [2024-11-29 05:33:12.631627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.484 [2024-11-29 05:33:12.631657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.484 [2024-11-29 05:33:12.631673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.484 [2024-11-29 05:33:12.631703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.484 [2024-11-29 05:33:12.631719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.484 #22 NEW cov: 11788 ft: 13142 corp: 5/67b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 4 
CrossOver-CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:01.484 [2024-11-29 05:33:12.691488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00934189 cdw11:1e0a0ab5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.484 [2024-11-29 05:33:12.691517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.484 #25 NEW cov: 11788 ft: 13384 corp: 6/75b lim: 40 exec/s: 0 rss: 68Mb L: 8/36 MS: 3 EraseBytes-ChangeByte-InsertByte- 00:08:01.484 [2024-11-29 05:33:12.741627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00934189 cdw11:1ef80ab5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.484 [2024-11-29 05:33:12.741656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.742 #26 NEW cov: 11788 ft: 13479 corp: 7/83b lim: 40 exec/s: 0 rss: 68Mb L: 8/36 MS: 1 ChangeBinInt- 00:08:01.742 [2024-11-29 05:33:12.801773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1e65a732 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.742 [2024-11-29 05:33:12.801803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.742 #27 NEW cov: 11788 ft: 13552 corp: 8/94b lim: 40 exec/s: 0 rss: 68Mb L: 11/36 MS: 1 InsertByte- 00:08:01.742 [2024-11-29 05:33:12.861930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000001 cdw11:1ea7f3c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.742 [2024-11-29 05:33:12.861960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.742 #28 NEW cov: 11788 ft: 13631 corp: 9/104b lim: 40 exec/s: 0 rss: 68Mb L: 10/36 MS: 1 CMP- DE: "\001\000\000\001"- 00:08:01.742 [2024-11-29 05:33:12.912137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0093af89 cdw11:1ea7327a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.742 [2024-11-29 05:33:12.912167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.742 [2024-11-29 05:33:12.912215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:7a7a7a7a cdw11:7a7a7ac2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.742 [2024-11-29 05:33:12.912231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.742 #29 NEW cov: 11788 ft: 13884 corp: 10/122b lim: 40 exec/s: 0 rss: 68Mb L: 18/36 MS: 1 InsertRepeatedBytes- 00:08:01.742 [2024-11-29 05:33:12.962184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0093a889 cdw11:1ea7f3c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.742 [2024-11-29 05:33:12.962214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.742 #30 NEW cov: 11788 ft: 13906 corp: 11/132b lim: 40 exec/s: 0 rss: 68Mb L: 10/36 MS: 1 ChangeBinInt- 00:08:01.742 [2024-11-29 05:33:13.012346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 
cdw11:1e65a731 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.742 [2024-11-29 05:33:13.012376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.001 #31 NEW cov: 11788 ft: 13962 corp: 12/143b lim: 40 exec/s: 0 rss: 68Mb L: 11/36 MS: 1 ChangeASCIIInt- 00:08:02.001 [2024-11-29 05:33:13.072506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0093a889 cdw11:1ea7f389 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.001 [2024-11-29 05:33:13.072535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.001 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:02.001 #32 NEW cov: 11805 ft: 14015 corp: 13/157b lim: 40 exec/s: 0 rss: 69Mb L: 14/36 MS: 1 CopyPart- 00:08:02.001 [2024-11-29 05:33:13.142701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0093a8a7 cdw11:f3c20a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.001 [2024-11-29 05:33:13.142731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.002 #33 NEW cov: 11805 ft: 14059 corp: 14/165b lim: 40 exec/s: 33 rss: 69Mb L: 8/36 MS: 1 EraseBytes- 00:08:02.002 [2024-11-29 05:33:13.202865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00933160 cdw11:891ea732 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.002 [2024-11-29 05:33:13.202895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.002 #34 NEW cov: 11805 ft: 14137 corp: 15/176b lim: 40 exec/s: 34 rss: 69Mb L: 11/36 MS: 1 InsertByte- 00:08:02.002 [2024-11-29 05:33:13.252958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1b65a731 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.002 [2024-11-29 05:33:13.252988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.002 #35 NEW cov: 11805 ft: 14183 corp: 16/187b lim: 40 exec/s: 35 rss: 69Mb L: 11/36 MS: 1 ChangeBinInt- 00:08:02.259 [2024-11-29 05:33:13.313153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1e32a732 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.260 [2024-11-29 05:33:13.313183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.260 #36 NEW cov: 11805 ft: 14218 corp: 17/198b lim: 40 exec/s: 36 rss: 69Mb L: 11/36 MS: 1 CopyPart- 00:08:02.260 [2024-11-29 05:33:13.363241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0093a889 cdw11:1ea7f3c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.260 [2024-11-29 05:33:13.363270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.260 #37 NEW cov: 11805 ft: 14263 corp: 18/206b lim: 40 exec/s: 37 rss: 69Mb L: 8/36 MS: 1 EraseBytes- 00:08:02.260 [2024-11-29 05:33:13.413371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:89609300 
cdw11:651ba731 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.260 [2024-11-29 05:33:13.413401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.260 #38 NEW cov: 11805 ft: 14289 corp: 19/217b lim: 40 exec/s: 38 rss: 69Mb L: 11/36 MS: 1 ShuffleBytes- 00:08:02.260 [2024-11-29 05:33:13.473584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e5e5e500 cdw11:9360891b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.260 [2024-11-29 05:33:13.473622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.260 #39 NEW cov: 11805 ft: 14369 corp: 20/231b lim: 40 exec/s: 39 rss: 69Mb L: 14/36 MS: 1 InsertRepeatedBytes- 00:08:02.260 [2024-11-29 05:33:13.523660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:891e65a7 cdw11:32c20a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.260 [2024-11-29 05:33:13.523690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.260 #40 NEW cov: 11805 ft: 14432 corp: 21/239b lim: 40 exec/s: 40 rss: 69Mb L: 8/36 MS: 1 EraseBytes- 00:08:02.518 [2024-11-29 05:33:13.573822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0093af89 cdw11:1ea7f30a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.518 [2024-11-29 05:33:13.573853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.518 #41 NEW cov: 11805 ft: 14441 corp: 22/251b lim: 40 exec/s: 41 rss: 69Mb L: 12/36 MS: 1 CopyPart- 00:08:02.518 [2024-11-29 05:33:13.623928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1e65a732 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.518 [2024-11-29 05:33:13.623960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.518 #42 NEW cov: 11805 ft: 14504 corp: 23/263b lim: 40 exec/s: 42 rss: 69Mb L: 12/36 MS: 1 InsertByte- 00:08:02.518 [2024-11-29 05:33:13.674088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1e65a7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.518 [2024-11-29 05:33:13.674119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.518 #43 NEW cov: 11805 ft: 14519 corp: 24/275b lim: 40 exec/s: 43 rss: 69Mb L: 12/36 MS: 1 InsertByte- 00:08:02.518 [2024-11-29 05:33:13.724221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:89609300 cdw11:651ba731 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.518 [2024-11-29 05:33:13.724252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.518 #44 NEW cov: 11805 ft: 14521 corp: 25/286b lim: 40 exec/s: 44 rss: 69Mb L: 11/36 MS: 1 ChangeBit- 00:08:02.518 [2024-11-29 05:33:13.784372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00933160 cdw11:77e2a732 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.518 [2024-11-29 05:33:13.784402] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.776 #45 NEW cov: 11805 ft: 14529 corp: 26/297b lim: 40 exec/s: 45 rss: 69Mb L: 11/36 MS: 1 ChangeBinInt- 00:08:02.776 [2024-11-29 05:33:13.844575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1b65a731 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.776 [2024-11-29 05:33:13.844613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.776 [2024-11-29 05:33:13.844662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c20a0a47 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.776 [2024-11-29 05:33:13.844677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.776 #46 NEW cov: 11805 ft: 14540 corp: 27/316b lim: 40 exec/s: 46 rss: 69Mb L: 19/36 MS: 1 CMP- DE: "G\000\000\000\000\000\000\000"- 00:08:02.776 [2024-11-29 05:33:13.894643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1e32a732 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.776 [2024-11-29 05:33:13.894672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.776 #47 NEW cov: 11805 ft: 14552 corp: 28/327b lim: 40 exec/s: 47 rss: 69Mb L: 11/36 MS: 1 ShuffleBytes- 00:08:02.776 [2024-11-29 05:33:13.954859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1e32a732 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.776 [2024-11-29 05:33:13.954888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.776 #48 NEW cov: 11805 ft: 14558 corp: 29/338b lim: 40 exec/s: 48 rss: 69Mb L: 11/36 MS: 1 ChangeBit- 00:08:02.776 [2024-11-29 05:33:14.004993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1e32a7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.776 [2024-11-29 05:33:14.005022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.776 [2024-11-29 05:33:14.005070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.776 [2024-11-29 05:33:14.005086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.776 #49 NEW cov: 11805 ft: 14575 corp: 30/359b lim: 40 exec/s: 49 rss: 69Mb L: 21/36 MS: 1 InsertRepeatedBytes- 00:08:02.776 [2024-11-29 05:33:14.065275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00936089 cdw11:1e32a700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.776 [2024-11-29 05:33:14.065308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.776 [2024-11-29 05:33:14.065357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:02.776 [2024-11-29 05:33:14.065372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.776 [2024-11-29 05:33:14.065402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.776 [2024-11-29 05:33:14.065417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.776 [2024-11-29 05:33:14.065447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.776 [2024-11-29 05:33:14.065462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.035 #50 NEW cov: 11812 ft: 14644 corp: 31/398b lim: 40 exec/s: 50 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:03.035 [2024-11-29 05:33:14.125300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00934189 cdw11:00936089 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.035 [2024-11-29 05:33:14.125329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.035 [2024-11-29 05:33:14.125377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:1e0a0ab5 cdw11:1e32a732 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.035 [2024-11-29 05:33:14.125393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.035 #51 NEW cov: 11812 ft: 14658 corp: 32/417b lim: 40 exec/s: 51 rss: 69Mb L: 19/39 MS: 1 CrossOver- 00:08:03.035 [2024-11-29 05:33:14.175408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0093a889 cdw11:1ea7f3c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.035 [2024-11-29 05:33:14.175438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.035 #52 NEW cov: 11812 ft: 14740 corp: 33/427b lim: 40 exec/s: 26 rss: 69Mb L: 10/39 MS: 1 CopyPart- 00:08:03.035 #52 DONE cov: 11812 ft: 14740 corp: 33/427b lim: 40 exec/s: 26 rss: 69Mb 00:08:03.035 ###### Recommended dictionary. ###### 00:08:03.035 "\000\223\257\211\036\247\363\302" # Uses: 0 00:08:03.035 "\001\000\000\001" # Uses: 0 00:08:03.035 "G\000\000\000\000\000\000\000" # Uses: 0 00:08:03.035 ###### End of recommended dictionary. 
###### 00:08:03.035 Done 52 runs in 2 second(s) 00:08:03.035 05:33:14 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:08:03.035 05:33:14 -- ../common.sh@72 -- # (( i++ )) 00:08:03.035 05:33:14 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.035 05:33:14 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:03.035 05:33:14 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:03.035 05:33:14 -- nvmf/run.sh@24 -- # local timen=1 00:08:03.035 05:33:14 -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.035 05:33:14 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:03.035 05:33:14 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:03.035 05:33:14 -- nvmf/run.sh@29 -- # printf %02d 11 00:08:03.035 05:33:14 -- nvmf/run.sh@29 -- # port=4411 00:08:03.035 05:33:14 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:03.035 05:33:14 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:03.035 05:33:14 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.293 05:33:14 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:08:03.293 [2024-11-29 05:33:14.366955] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:03.293 [2024-11-29 05:33:14.367040] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2218429 ] 00:08:03.293 EAL: No free 2048 kB hugepages reported on node 1 00:08:03.552 [2024-11-29 05:33:14.617271] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.552 [2024-11-29 05:33:14.644124] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:03.552 [2024-11-29 05:33:14.644259] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.552 [2024-11-29 05:33:14.695530] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.552 [2024-11-29 05:33:14.711896] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:03.552 INFO: Running with entropic power schedule (0xFF, 100). 00:08:03.552 INFO: Seed: 924388309 00:08:03.552 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:03.552 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:03.552 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:03.552 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.552 #2 INITED exec/s: 0 rss: 59Mb 00:08:03.552 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:03.552 This may also happen if the target rejected all inputs we tried so far 00:08:03.552 [2024-11-29 05:33:14.761151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:0d0d0d0d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.552 [2024-11-29 05:33:14.761179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.552 [2024-11-29 05:33:14.761238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0d0d0d0d cdw11:0d0d0d0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.552 [2024-11-29 05:33:14.761253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.810 NEW_FUNC[1/671]: 0x45ffb8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:03.810 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.810 #13 NEW cov: 11596 ft: 11596 corp: 2/17b lim: 40 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:08:03.810 [2024-11-29 05:33:15.061834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.810 [2024-11-29 05:33:15.061866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.810 #15 NEW cov: 11709 ft: 12809 corp: 3/32b lim: 40 exec/s: 0 rss: 68Mb L: 15/16 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:03.810 [2024-11-29 05:33:15.102006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:0d0d0d0d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.810 [2024-11-29 05:33:15.102032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.810 [2024-11-29 05:33:15.102089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.810 [2024-11-29 05:33:15.102107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.069 #21 NEW cov: 11715 ft: 12996 corp: 4/48b lim: 40 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 CMP- DE: "\000\000\000\000\000\000\004\000"- 00:08:04.069 [2024-11-29 05:33:15.142291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.069 [2024-11-29 05:33:15.142317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.069 [2024-11-29 05:33:15.142394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.069 [2024-11-29 05:33:15.142409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.069 [2024-11-29 05:33:15.142467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.069 [2024-11-29 05:33:15.142481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.069 #22 NEW cov: 11800 ft: 13541 corp: 5/73b lim: 40 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:04.069 [2024-11-29 05:33:15.182023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.069 [2024-11-29 05:33:15.182049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.069 #23 NEW cov: 11800 ft: 13674 corp: 6/88b lim: 40 exec/s: 0 rss: 68Mb L: 15/25 MS: 1 CopyPart- 00:08:04.069 [2024-11-29 05:33:15.222368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:0d0d0d66 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.069 [2024-11-29 05:33:15.222394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.069 [2024-11-29 05:33:15.222467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66660d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.069 [2024-11-29 05:33:15.222481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.069 #24 NEW cov: 11800 ft: 13702 corp: 7/111b lim: 40 exec/s: 0 rss: 68Mb L: 23/25 MS: 1 InsertRepeatedBytes- 00:08:04.069 [2024-11-29 05:33:15.262975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.069 [2024-11-29 05:33:15.263000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.069 [2024-11-29 05:33:15.263076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.069 [2024-11-29 05:33:15.263090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.069 [2024-11-29 05:33:15.263147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.069 [2024-11-29 05:33:15.263161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.069 [2024-11-29 05:33:15.263220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.070 [2024-11-29 05:33:15.263233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.070 [2024-11-29 05:33:15.263290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.070 [2024-11-29 05:33:15.263307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
00:08:04.070 #25 NEW cov: 11800 ft: 14164 corp: 8/151b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:04.070 [2024-11-29 05:33:15.313128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.070 [2024-11-29 05:33:15.313154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.070 [2024-11-29 05:33:15.313214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.070 [2024-11-29 05:33:15.313228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.070 [2024-11-29 05:33:15.313298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.070 [2024-11-29 05:33:15.313312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.070 [2024-11-29 05:33:15.313368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.070 [2024-11-29 05:33:15.313381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.070 [2024-11-29 05:33:15.313438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.070 [2024-11-29 05:33:15.313451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.070 #26 NEW cov: 11800 ft: 14198 corp: 9/191b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:04.070 [2024-11-29 05:33:15.352746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:f6f2f299 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.070 [2024-11-29 05:33:15.352771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.070 [2024-11-29 05:33:15.352827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:66660d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.070 [2024-11-29 05:33:15.352841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.329 #27 NEW cov: 11800 ft: 14295 corp: 10/214b lim: 40 exec/s: 0 rss: 68Mb L: 23/40 MS: 1 ChangeBinInt- 00:08:04.329 [2024-11-29 05:33:15.392860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:f6f2f299 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.392885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.329 [2024-11-29 05:33:15.392945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:66660d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 
05:33:15.392959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.329 #28 NEW cov: 11800 ft: 14365 corp: 11/237b lim: 40 exec/s: 0 rss: 68Mb L: 23/40 MS: 1 ChangeBit- 00:08:04.329 [2024-11-29 05:33:15.433025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:0d0d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.433049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.329 [2024-11-29 05:33:15.433113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:04000d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.433127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.329 #29 NEW cov: 11800 ft: 14491 corp: 12/260b lim: 40 exec/s: 0 rss: 68Mb L: 23/40 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\004\000"- 00:08:04.329 [2024-11-29 05:33:15.473512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.473537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.329 [2024-11-29 05:33:15.473625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.473640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.329 [2024-11-29 05:33:15.473699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.473713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.329 [2024-11-29 05:33:15.473772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.473786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.329 [2024-11-29 05:33:15.473855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.473868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.329 #30 NEW cov: 11800 ft: 14522 corp: 13/300b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:04.329 [2024-11-29 05:33:15.513198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.513223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.329 [2024-11-29 05:33:15.513278] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000400ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.329 [2024-11-29 05:33:15.513292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.329 #31 NEW cov: 11800 ft: 14561 corp: 14/323b lim: 40 exec/s: 0 rss: 69Mb L: 23/40 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\004\000"- 00:08:04.330 [2024-11-29 05:33:15.553281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.330 [2024-11-29 05:33:15.553306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.330 [2024-11-29 05:33:15.553363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0affffff cdw11:ff0a00ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.330 [2024-11-29 05:33:15.553377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.330 #32 NEW cov: 11800 ft: 14600 corp: 15/345b lim: 40 exec/s: 0 rss: 69Mb L: 22/40 MS: 1 CopyPart- 00:08:04.330 [2024-11-29 05:33:15.593906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.330 [2024-11-29 05:33:15.593935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.330 [2024-11-29 05:33:15.593998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffab cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.330 [2024-11-29 05:33:15.594013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.330 [2024-11-29 05:33:15.594073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.330 [2024-11-29 05:33:15.594087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.330 [2024-11-29 05:33:15.594142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.330 [2024-11-29 05:33:15.594156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.330 [2024-11-29 05:33:15.594216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.330 [2024-11-29 05:33:15.594230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.330 #33 NEW cov: 11800 ft: 14611 corp: 16/385b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:08:04.589 [2024-11-29 05:33:15.633422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.633447] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.590 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.590 #34 NEW cov: 11823 ft: 14637 corp: 17/393b lim: 40 exec/s: 0 rss: 69Mb L: 8/40 MS: 1 CrossOver- 00:08:04.590 [2024-11-29 05:33:15.673629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.673655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.673734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0a00ffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.673748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.590 #35 NEW cov: 11823 ft: 14707 corp: 18/416b lim: 40 exec/s: 0 rss: 69Mb L: 23/40 MS: 1 CrossOver- 00:08:04.590 [2024-11-29 05:33:15.713937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:0d0d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.713963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.714024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:04000d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.714038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.714097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00007200 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.714111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.590 #36 NEW cov: 11823 ft: 14717 corp: 19/440b lim: 40 exec/s: 0 rss: 69Mb L: 24/40 MS: 1 InsertByte- 00:08:04.590 [2024-11-29 05:33:15.754091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0000002d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.754116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.754178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.754191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.754249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.754261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.590 #37 NEW 
cov: 11823 ft: 14749 corp: 20/466b lim: 40 exec/s: 37 rss: 69Mb L: 26/40 MS: 1 InsertByte- 00:08:04.590 [2024-11-29 05:33:15.794441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.794466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.794525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.794539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.794602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.794615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.794672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.794685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.794744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.794757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.590 #38 NEW cov: 11823 ft: 14767 corp: 21/506b lim: 40 exec/s: 38 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:08:04.590 [2024-11-29 05:33:15.834258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.834283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.834341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.834355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.834411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.834424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.590 #39 NEW cov: 11823 ft: 14788 corp: 22/531b lim: 40 exec/s: 39 rss: 69Mb L: 25/40 MS: 1 ShuffleBytes- 00:08:04.590 [2024-11-29 05:33:15.874387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.874413] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.874486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.874500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.590 [2024-11-29 05:33:15.874554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.590 [2024-11-29 05:33:15.874567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.849 #40 NEW cov: 11823 ft: 14861 corp: 23/556b lim: 40 exec/s: 40 rss: 69Mb L: 25/40 MS: 1 CrossOver- 00:08:04.849 [2024-11-29 05:33:15.914175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.849 [2024-11-29 05:33:15.914200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.850 #41 NEW cov: 11823 ft: 14868 corp: 24/571b lim: 40 exec/s: 41 rss: 69Mb L: 15/40 MS: 1 ChangeBinInt- 00:08:04.850 [2024-11-29 05:33:15.954458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:f6b2f299 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:15.954484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:15.954540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:66660d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:15.954554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.850 #42 NEW cov: 11823 ft: 14888 corp: 25/594b lim: 40 exec/s: 42 rss: 69Mb L: 23/40 MS: 1 ChangeBit- 00:08:04.850 [2024-11-29 05:33:15.995083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:15.995108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:15.995164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:15.995178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:15.995234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:15.995248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:15.995304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 
cdw10:f7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:15.995317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:15.995373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:15.995387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.850 #43 NEW cov: 11823 ft: 14942 corp: 26/634b lim: 40 exec/s: 43 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:04.850 [2024-11-29 05:33:16.035028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:16.035053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:16.035127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0affffff cdw11:ff0a00ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:16.035142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:16.035197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffff0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:16.035210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:16.035267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0a00ff00 cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:16.035280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.850 #44 NEW cov: 11823 ft: 14954 corp: 27/672b lim: 40 exec/s: 44 rss: 69Mb L: 38/40 MS: 1 CopyPart- 00:08:04.850 [2024-11-29 05:33:16.075160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:f6b2f299 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:16.075186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:16.075257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:66660d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:16.075273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:16.075332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0004ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:16.075346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.850 [2024-11-29 05:33:16.075403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:16.075418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.850 #45 NEW cov: 11823 ft: 14960 corp: 28/709b lim: 40 exec/s: 45 rss: 69Mb L: 37/40 MS: 1 InsertRepeatedBytes- 00:08:04.850 [2024-11-29 05:33:16.114812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d66 cdw11:660d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.850 [2024-11-29 05:33:16.114837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.850 #46 NEW cov: 11823 ft: 14961 corp: 29/723b lim: 40 exec/s: 46 rss: 69Mb L: 14/40 MS: 1 EraseBytes- 00:08:05.110 [2024-11-29 05:33:16.155063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:0d0d0d66 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.155088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.110 [2024-11-29 05:33:16.155148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66660d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.155165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.110 #47 NEW cov: 11823 ft: 14965 corp: 30/746b lim: 40 exec/s: 47 rss: 70Mb L: 23/40 MS: 1 ChangeBinInt- 00:08:05.110 [2024-11-29 05:33:16.195472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:0d0d0d0d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.195497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.110 [2024-11-29 05:33:16.195570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0dffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.195584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.110 [2024-11-29 05:33:16.195645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.195659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.110 [2024-11-29 05:33:16.195722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff0d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.195735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.110 #48 NEW cov: 11823 ft: 14982 corp: 31/784b lim: 40 exec/s: 48 rss: 70Mb L: 38/40 MS: 1 CrossOver- 00:08:05.110 [2024-11-29 05:33:16.235725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.235750] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.110 [2024-11-29 05:33:16.235809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000400 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.235822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.110 [2024-11-29 05:33:16.235879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.235892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.110 [2024-11-29 05:33:16.235947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.235960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.110 [2024-11-29 05:33:16.236017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.236030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.110 #49 NEW cov: 11823 ft: 14986 corp: 32/824b lim: 40 exec/s: 49 rss: 70Mb L: 40/40 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\004\000"- 00:08:05.110 [2024-11-29 05:33:16.275685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:f6b2f299 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.275711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.110 [2024-11-29 05:33:16.275786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:66660d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.110 [2024-11-29 05:33:16.275803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.275863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0004ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.275876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.275933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.275946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.111 #50 NEW cov: 11823 ft: 14996 corp: 33/861b lim: 40 exec/s: 50 rss: 70Mb L: 37/40 MS: 1 CopyPart- 00:08:05.111 [2024-11-29 05:33:16.315809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0000002d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:05.111 [2024-11-29 05:33:16.315834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.315893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.315906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.315963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.315976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.316030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.316043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.111 #51 NEW cov: 11823 ft: 15006 corp: 34/895b lim: 40 exec/s: 51 rss: 70Mb L: 34/40 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\004\000"- 00:08:05.111 [2024-11-29 05:33:16.356122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.356148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.356208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.356222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.356280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.356292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.356349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:f7ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.356362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.356419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.356436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.111 #52 NEW cov: 11823 ft: 15019 corp: 35/935b lim: 40 exec/s: 52 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:08:05.111 [2024-11-29 05:33:16.395933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.395958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.396017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.396030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.111 [2024-11-29 05:33:16.396089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:00100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.111 [2024-11-29 05:33:16.396103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.371 #53 NEW cov: 11823 ft: 15024 corp: 36/964b lim: 40 exec/s: 53 rss: 70Mb L: 29/40 MS: 1 CMP- DE: "\001\000\000\020"- 00:08:05.371 [2024-11-29 05:33:16.435865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0d0d0d0d cdw11:0d0d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.435890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.371 [2024-11-29 05:33:16.435946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:04000d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.435960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.371 #54 NEW cov: 11823 ft: 15046 corp: 37/987b lim: 40 exec/s: 54 rss: 70Mb L: 23/40 MS: 1 ShuffleBytes- 00:08:05.371 [2024-11-29 05:33:16.475791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.475817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.371 #55 NEW cov: 11823 ft: 15075 corp: 38/1002b lim: 40 exec/s: 55 rss: 70Mb L: 15/40 MS: 1 ShuffleBytes- 00:08:05.371 [2024-11-29 05:33:16.516553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.516578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.371 [2024-11-29 05:33:16.516643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000400 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.516657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.371 [2024-11-29 05:33:16.516716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.516730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:08:05.371 [2024-11-29 05:33:16.516788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.516802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.371 [2024-11-29 05:33:16.516865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.516879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.371 #56 NEW cov: 11823 ft: 15081 corp: 39/1042b lim: 40 exec/s: 56 rss: 70Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:05.371 [2024-11-29 05:33:16.556392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff3affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.556418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.371 [2024-11-29 05:33:16.556477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.556491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.371 [2024-11-29 05:33:16.556549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.556562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.371 #57 NEW cov: 11823 ft: 15090 corp: 40/1066b lim: 40 exec/s: 57 rss: 70Mb L: 24/40 MS: 1 InsertByte- 00:08:05.371 [2024-11-29 05:33:16.596506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff3affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.596531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.371 [2024-11-29 05:33:16.596593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.596611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.371 [2024-11-29 05:33:16.596671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffdfffff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.596684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.371 #58 NEW cov: 11823 ft: 15126 corp: 41/1090b lim: 40 exec/s: 58 rss: 70Mb L: 24/40 MS: 1 ChangeBit- 00:08:05.371 [2024-11-29 05:33:16.636469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ff010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 
05:33:16.636494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.371 [2024-11-29 05:33:16.636552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.371 [2024-11-29 05:33:16.636565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.371 #59 NEW cov: 11823 ft: 15137 corp: 42/1106b lim: 40 exec/s: 59 rss: 70Mb L: 16/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:05.632 [2024-11-29 05:33:16.676421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.676447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.632 #60 NEW cov: 11823 ft: 15163 corp: 43/1114b lim: 40 exec/s: 60 rss: 70Mb L: 8/40 MS: 1 ChangeBinInt- 00:08:05.632 [2024-11-29 05:33:16.717008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0000002d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.717038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.632 [2024-11-29 05:33:16.717098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.717111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.632 [2024-11-29 05:33:16.717181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2d000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.717195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.632 [2024-11-29 05:33:16.717252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.717265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.632 #61 NEW cov: 11823 ft: 15201 corp: 44/1151b lim: 40 exec/s: 61 rss: 70Mb L: 37/40 MS: 1 CopyPart- 00:08:05.632 [2024-11-29 05:33:16.757312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.757338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.632 [2024-11-29 05:33:16.757397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.757411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.632 [2024-11-29 05:33:16.757467] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.757480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.632 [2024-11-29 05:33:16.757537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.757550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.632 [2024-11-29 05:33:16.757611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff0a00ff cdw11:ffff0a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.632 [2024-11-29 05:33:16.757624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.632 #67 NEW cov: 11823 ft: 15207 corp: 45/1191b lim: 40 exec/s: 33 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:08:05.632 #67 DONE cov: 11823 ft: 15207 corp: 45/1191b lim: 40 exec/s: 33 rss: 70Mb 00:08:05.632 ###### Recommended dictionary. ###### 00:08:05.632 "\000\000\000\000\000\000\004\000" # Uses: 4 00:08:05.632 "\001\000\000\020" # Uses: 1 00:08:05.632 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:05.632 ###### End of recommended dictionary. ###### 00:08:05.632 Done 67 runs in 2 second(s) 00:08:05.632 05:33:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:08:05.632 05:33:16 -- ../common.sh@72 -- # (( i++ )) 00:08:05.632 05:33:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.632 05:33:16 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:05.632 05:33:16 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:05.632 05:33:16 -- nvmf/run.sh@24 -- # local timen=1 00:08:05.632 05:33:16 -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.632 05:33:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:05.632 05:33:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:05.632 05:33:16 -- nvmf/run.sh@29 -- # printf %02d 12 00:08:05.632 05:33:16 -- nvmf/run.sh@29 -- # port=4412 00:08:05.632 05:33:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:05.632 05:33:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:05.632 05:33:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.632 05:33:16 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:08:05.892 [2024-11-29 05:33:16.934311] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:05.892 [2024-11-29 05:33:16.934383] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2218972 ] 00:08:05.892 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.892 [2024-11-29 05:33:17.182803] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.151 [2024-11-29 05:33:17.211051] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:06.151 [2024-11-29 05:33:17.211187] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.151 [2024-11-29 05:33:17.262372] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:06.151 [2024-11-29 05:33:17.278752] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:06.151 INFO: Running with entropic power schedule (0xFF, 100). 00:08:06.151 INFO: Seed: 3491386554 00:08:06.151 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:06.151 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:06.151 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:06.151 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.151 #2 INITED exec/s: 0 rss: 59Mb 00:08:06.151 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:06.151 This may also happen if the target rejected all inputs we tried so far 00:08:06.151 [2024-11-29 05:33:17.324111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.151 [2024-11-29 05:33:17.324138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.151 [2024-11-29 05:33:17.324194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.151 [2024-11-29 05:33:17.324208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.410 NEW_FUNC[1/671]: 0x461d28 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:06.410 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.410 #15 NEW cov: 11579 ft: 11595 corp: 2/24b lim: 40 exec/s: 0 rss: 67Mb L: 23/23 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:06.410 [2024-11-29 05:33:17.624620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.410 [2024-11-29 05:33:17.624656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.410 #18 NEW cov: 11707 ft: 12879 corp: 3/34b lim: 40 exec/s: 0 rss: 67Mb L: 10/23 MS: 3 ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:06.410 [2024-11-29 05:33:17.664855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000b0000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.410 [2024-11-29 05:33:17.664882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.410 [2024-11-29 05:33:17.664935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.410 [2024-11-29 05:33:17.664950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.410 #20 NEW cov: 11713 ft: 13078 corp: 4/50b lim: 40 exec/s: 0 rss: 68Mb L: 16/23 MS: 2 EraseBytes-CrossOver- 00:08:06.410 [2024-11-29 05:33:17.704965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.410 [2024-11-29 05:33:17.704991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.410 [2024-11-29 05:33:17.705046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.410 [2024-11-29 05:33:17.705060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.670 #21 NEW cov: 11798 ft: 13316 corp: 5/66b lim: 40 exec/s: 0 rss: 68Mb L: 16/23 MS: 1 CopyPart- 00:08:06.670 [2024-11-29 05:33:17.745056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ec393939 cdw11:39003939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.745082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.670 [2024-11-29 05:33:17.745138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39003939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.745152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.670 #26 NEW cov: 11798 ft: 13447 corp: 6/82b lim: 40 exec/s: 0 rss: 68Mb L: 16/23 MS: 5 EraseBytes-ChangeBinInt-ChangeByte-EraseBytes-CrossOver- 00:08:06.670 [2024-11-29 05:33:17.785163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.785189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.670 [2024-11-29 05:33:17.785245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.785259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.670 #27 NEW cov: 11798 ft: 13575 corp: 7/100b lim: 40 exec/s: 0 rss: 68Mb L: 18/23 MS: 1 CopyPart- 00:08:06.670 [2024-11-29 05:33:17.825258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.825284] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.670 [2024-11-29 05:33:17.825338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.825352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.670 #28 NEW cov: 11798 ft: 13637 corp: 8/116b lim: 40 exec/s: 0 rss: 68Mb L: 16/23 MS: 1 CrossOver- 00:08:06.670 [2024-11-29 05:33:17.865559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.865588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.670 [2024-11-29 05:33:17.865649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:0000000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.865664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.670 [2024-11-29 05:33:17.865717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.865731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.670 #29 NEW cov: 11798 ft: 13915 corp: 9/142b lim: 40 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 CopyPart- 00:08:06.670 [2024-11-29 05:33:17.905865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.905891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.670 [2024-11-29 05:33:17.905947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.905961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.670 [2024-11-29 05:33:17.906012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:39adadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.906041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.670 [2024-11-29 05:33:17.906094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.906107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.670 #30 NEW cov: 11798 ft: 14241 corp: 10/180b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:06.670 [2024-11-29 05:33:17.945478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 
nsid:0 cdw10:1b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.670 [2024-11-29 05:33:17.945504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.670 #31 NEW cov: 11798 ft: 14277 corp: 11/190b lim: 40 exec/s: 0 rss: 68Mb L: 10/38 MS: 1 ChangeBit- 00:08:06.930 [2024-11-29 05:33:17.985757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.930 [2024-11-29 05:33:17.985782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.930 [2024-11-29 05:33:17.985838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.930 [2024-11-29 05:33:17.985852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.930 #32 NEW cov: 11798 ft: 14399 corp: 12/206b lim: 40 exec/s: 0 rss: 68Mb L: 16/38 MS: 1 CopyPart- 00:08:06.930 [2024-11-29 05:33:18.025889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8080ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.930 [2024-11-29 05:33:18.025914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.930 [2024-11-29 05:33:18.025972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.930 [2024-11-29 05:33:18.025986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.931 #37 NEW cov: 11798 ft: 14414 corp: 13/222b lim: 40 exec/s: 0 rss: 68Mb L: 16/38 MS: 5 ChangeBinInt-ChangeBit-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:06.931 [2024-11-29 05:33:18.055795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00300000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.055820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.931 #38 NEW cov: 11798 ft: 14478 corp: 14/232b lim: 40 exec/s: 0 rss: 68Mb L: 10/38 MS: 1 ChangeByte- 00:08:06.931 [2024-11-29 05:33:18.096230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.096255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.931 [2024-11-29 05:33:18.096312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:0000000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.096325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.931 [2024-11-29 05:33:18.096381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000035 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 
[2024-11-29 05:33:18.096394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.931 #39 NEW cov: 11798 ft: 14512 corp: 15/258b lim: 40 exec/s: 0 rss: 68Mb L: 26/38 MS: 1 ChangeByte- 00:08:06.931 [2024-11-29 05:33:18.136223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.136250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.931 [2024-11-29 05:33:18.136307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.136320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.931 #40 NEW cov: 11798 ft: 14540 corp: 16/274b lim: 40 exec/s: 0 rss: 68Mb L: 16/38 MS: 1 ShuffleBytes- 00:08:06.931 [2024-11-29 05:33:18.176329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.176354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.931 [2024-11-29 05:33:18.176410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.176423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.931 #41 NEW cov: 11798 ft: 14557 corp: 17/290b lim: 40 exec/s: 0 rss: 68Mb L: 16/38 MS: 1 ShuffleBytes- 00:08:06.931 [2024-11-29 05:33:18.216839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.216864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.931 [2024-11-29 05:33:18.216920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.216936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.931 [2024-11-29 05:33:18.216991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.217004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.931 [2024-11-29 05:33:18.217060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.931 [2024-11-29 05:33:18.217073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.191 NEW_FUNC[1/1]: 0x1967b18 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:07.191 #42 NEW cov: 11821 ft: 14603 corp: 18/327b lim: 40 exec/s: 0 rss: 68Mb L: 37/38 MS: 1 CopyPart- 00:08:07.191 [2024-11-29 05:33:18.266666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.266699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-11-29 05:33:18.266754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.266768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 #43 NEW cov: 11821 ft: 14608 corp: 19/343b lim: 40 exec/s: 0 rss: 69Mb L: 16/38 MS: 1 CopyPart- 00:08:07.191 [2024-11-29 05:33:18.306726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.306751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-11-29 05:33:18.306809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00002400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.306823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 #44 NEW cov: 11821 ft: 14616 corp: 20/359b lim: 40 exec/s: 44 rss: 69Mb L: 16/38 MS: 1 ChangeByte- 00:08:07.191 [2024-11-29 05:33:18.346845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8080ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.346871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-11-29 05:33:18.346925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff80ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.346939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 #45 NEW cov: 11821 ft: 14703 corp: 21/375b lim: 40 exec/s: 45 rss: 69Mb L: 16/38 MS: 1 CopyPart- 00:08:07.191 [2024-11-29 05:33:18.386921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8083ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.386947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-11-29 05:33:18.387005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.387022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 #46 NEW cov: 11821 ft: 14737 corp: 22/391b lim: 
40 exec/s: 46 rss: 69Mb L: 16/38 MS: 1 ChangeBinInt- 00:08:07.191 [2024-11-29 05:33:18.427034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.427059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-11-29 05:33:18.427116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00002400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.427130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 #47 NEW cov: 11821 ft: 14760 corp: 23/407b lim: 40 exec/s: 47 rss: 69Mb L: 16/38 MS: 1 ShuffleBytes- 00:08:07.191 [2024-11-29 05:33:18.467503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.467528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-11-29 05:33:18.467605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.467620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 [2024-11-29 05:33:18.467674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.467688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.191 [2024-11-29 05:33:18.467744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.191 [2024-11-29 05:33:18.467757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.454 #48 NEW cov: 11821 ft: 14770 corp: 24/444b lim: 40 exec/s: 48 rss: 69Mb L: 37/38 MS: 1 ShuffleBytes- 00:08:07.454 [2024-11-29 05:33:18.507282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000050 cdw11:2be8448c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.507307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.454 [2024-11-29 05:33:18.507364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:af930000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.507377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.454 #49 NEW cov: 11821 ft: 14793 corp: 25/460b lim: 40 exec/s: 49 rss: 69Mb L: 16/38 MS: 1 CMP- DE: "P+\350D\214\257\223\000"- 00:08:07.454 [2024-11-29 05:33:18.537674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:39393939 
cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.537698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.454 [2024-11-29 05:33:18.537754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.537768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.454 [2024-11-29 05:33:18.537827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:39adadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.537841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.454 [2024-11-29 05:33:18.537894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.537907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.454 #50 NEW cov: 11821 ft: 14804 corp: 26/498b lim: 40 exec/s: 50 rss: 69Mb L: 38/38 MS: 1 ShuffleBytes- 00:08:07.454 [2024-11-29 05:33:18.577334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8080ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.577358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.454 #51 NEW cov: 11821 ft: 14818 corp: 27/510b lim: 40 exec/s: 51 rss: 69Mb L: 12/38 MS: 1 EraseBytes- 00:08:07.454 [2024-11-29 05:33:18.617768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.617793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.454 [2024-11-29 05:33:18.617850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000a7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.617864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.454 [2024-11-29 05:33:18.617921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:a7a7a7a7 cdw11:a7a7a7a7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.454 [2024-11-29 05:33:18.617934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.454 #52 NEW cov: 11821 ft: 14831 corp: 28/539b lim: 40 exec/s: 52 rss: 69Mb L: 29/38 MS: 1 InsertRepeatedBytes- 00:08:07.455 [2024-11-29 05:33:18.657699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.455 [2024-11-29 05:33:18.657723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.455 
[2024-11-29 05:33:18.657779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.455 [2024-11-29 05:33:18.657792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.455 #53 NEW cov: 11821 ft: 14840 corp: 29/557b lim: 40 exec/s: 53 rss: 69Mb L: 18/38 MS: 1 CrossOver- 00:08:07.455 [2024-11-29 05:33:18.698014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.455 [2024-11-29 05:33:18.698039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.455 [2024-11-29 05:33:18.698094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:0000000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.455 [2024-11-29 05:33:18.698108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.455 [2024-11-29 05:33:18.698160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.455 [2024-11-29 05:33:18.698189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.455 #54 NEW cov: 11821 ft: 14864 corp: 30/583b lim: 40 exec/s: 54 rss: 69Mb L: 26/38 MS: 1 ChangeASCIIInt- 00:08:07.455 [2024-11-29 05:33:18.738273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.455 [2024-11-29 05:33:18.738298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.455 [2024-11-29 05:33:18.738368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.455 [2024-11-29 05:33:18.738382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.455 [2024-11-29 05:33:18.738435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.455 [2024-11-29 05:33:18.738448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.455 [2024-11-29 05:33:18.738501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.455 [2024-11-29 05:33:18.738514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.715 #55 NEW cov: 11821 ft: 14913 corp: 31/620b lim: 40 exec/s: 55 rss: 69Mb L: 37/38 MS: 1 ChangeBit- 00:08:07.715 [2024-11-29 05:33:18.777952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8080ffff cdw11:ffff3aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.777978] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.715 #56 NEW cov: 11821 ft: 14915 corp: 32/633b lim: 40 exec/s: 56 rss: 69Mb L: 13/38 MS: 1 InsertByte- 00:08:07.715 [2024-11-29 05:33:18.818527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.818552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.715 [2024-11-29 05:33:18.818611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:0000000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.818625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.715 [2024-11-29 05:33:18.818678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.818691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.715 [2024-11-29 05:33:18.818759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:502be844 cdw11:8caf9300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.818773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.715 #57 NEW cov: 11821 ft: 14943 corp: 33/667b lim: 40 exec/s: 57 rss: 69Mb L: 34/38 MS: 1 PersAutoDict- DE: "P+\350D\214\257\223\000"- 00:08:07.715 [2024-11-29 05:33:18.858479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.858504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.715 [2024-11-29 05:33:18.858562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:0400000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.858579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.715 [2024-11-29 05:33:18.858635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.858649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.715 #58 NEW cov: 11821 ft: 14952 corp: 34/693b lim: 40 exec/s: 58 rss: 69Mb L: 26/38 MS: 1 ChangeBinInt- 00:08:07.715 [2024-11-29 05:33:18.898413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.898438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.715 [2024-11-29 05:33:18.898491] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:502be844 cdw11:8caf9300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.898504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.715 #59 NEW cov: 11821 ft: 14968 corp: 35/716b lim: 40 exec/s: 59 rss: 69Mb L: 23/38 MS: 1 PersAutoDict- DE: "P+\350D\214\257\223\000"- 00:08:07.715 [2024-11-29 05:33:18.938388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b200000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.938413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.715 #60 NEW cov: 11821 ft: 15029 corp: 36/726b lim: 40 exec/s: 60 rss: 69Mb L: 10/38 MS: 1 ChangeBit- 00:08:07.715 [2024-11-29 05:33:18.978809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00260000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.978833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.715 [2024-11-29 05:33:18.978890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:0000000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.978903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.715 [2024-11-29 05:33:18.978976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.715 [2024-11-29 05:33:18.978990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.715 #61 NEW cov: 11821 ft: 15075 corp: 37/752b lim: 40 exec/s: 61 rss: 70Mb L: 26/38 MS: 1 ChangeByte- 00:08:07.975 [2024-11-29 05:33:19.019135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39373939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.019160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.019214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.019228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.019282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:39adadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.019296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.019354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.019367] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.975 #62 NEW cov: 11821 ft: 15092 corp: 38/790b lim: 40 exec/s: 62 rss: 70Mb L: 38/38 MS: 1 ChangeBinInt- 00:08:07.975 [2024-11-29 05:33:19.058758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e8448caf cdw11:93003939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.058783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.975 #63 NEW cov: 11821 ft: 15102 corp: 39/803b lim: 40 exec/s: 63 rss: 70Mb L: 13/38 MS: 1 EraseBytes- 00:08:07.975 [2024-11-29 05:33:19.099336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.099361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.099433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39c0c639 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.099447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.099501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:39adadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.099515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.099566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:adadadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.099579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.975 #64 NEW cov: 11821 ft: 15132 corp: 40/841b lim: 40 exec/s: 64 rss: 70Mb L: 38/38 MS: 1 ChangeBinInt- 00:08:07.975 [2024-11-29 05:33:19.139438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.139464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.139518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.139531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.139583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:39adadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.139601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.139658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 
cid:7 nsid:0 cdw10:adadadad cdw11:adadadad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.139671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.975 #65 NEW cov: 11821 ft: 15141 corp: 41/879b lim: 40 exec/s: 65 rss: 70Mb L: 38/38 MS: 1 CrossOver- 00:08:07.975 [2024-11-29 05:33:19.179243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.179272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.179326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.179340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.975 #66 NEW cov: 11821 ft: 15197 corp: 42/895b lim: 40 exec/s: 66 rss: 70Mb L: 16/38 MS: 1 ShuffleBytes- 00:08:07.975 [2024-11-29 05:33:19.219394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00260000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.219420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.975 [2024-11-29 05:33:19.219477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:0000000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.219491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.975 #67 NEW cov: 11821 ft: 15274 corp: 43/914b lim: 40 exec/s: 67 rss: 70Mb L: 19/38 MS: 1 EraseBytes- 00:08:07.975 [2024-11-29 05:33:19.259298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.975 [2024-11-29 05:33:19.259323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.235 #68 NEW cov: 11821 ft: 15288 corp: 44/923b lim: 40 exec/s: 68 rss: 70Mb L: 9/38 MS: 1 CrossOver- 00:08:08.235 [2024-11-29 05:33:19.299574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.235 [2024-11-29 05:33:19.299604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.235 [2024-11-29 05:33:19.299663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.235 [2024-11-29 05:33:19.299677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.235 [2024-11-29 05:33:19.339746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:000b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.235 [2024-11-29 05:33:19.339771] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.235 [2024-11-29 05:33:19.339826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:10000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.235 [2024-11-29 05:33:19.339840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.235 #70 NEW cov: 11821 ft: 15325 corp: 45/939b lim: 40 exec/s: 35 rss: 70Mb L: 16/38 MS: 2 ChangeBinInt-ChangeBinInt- 00:08:08.235 #70 DONE cov: 11821 ft: 15325 corp: 45/939b lim: 40 exec/s: 35 rss: 70Mb 00:08:08.235 ###### Recommended dictionary. ###### 00:08:08.235 "P+\350D\214\257\223\000" # Uses: 2 00:08:08.235 ###### End of recommended dictionary. ###### 00:08:08.235 Done 70 runs in 2 second(s) 00:08:08.235 05:33:19 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:08.235 05:33:19 -- ../common.sh@72 -- # (( i++ )) 00:08:08.235 05:33:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.235 05:33:19 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:08.235 05:33:19 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:08.235 05:33:19 -- nvmf/run.sh@24 -- # local timen=1 00:08:08.235 05:33:19 -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.235 05:33:19 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:08.235 05:33:19 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:08.235 05:33:19 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:08.235 05:33:19 -- nvmf/run.sh@29 -- # port=4413 00:08:08.235 05:33:19 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:08.235 05:33:19 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:08.235 05:33:19 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.235 05:33:19 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:08.235 [2024-11-29 05:33:19.516288] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:08.235 [2024-11-29 05:33:19.516360] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2219423 ] 00:08:08.494 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.494 [2024-11-29 05:33:19.762150] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.494 [2024-11-29 05:33:19.791196] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.494 [2024-11-29 05:33:19.791319] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.754 [2024-11-29 05:33:19.842860] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.754 [2024-11-29 05:33:19.859206] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:08.754 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.754 INFO: Seed: 1777421148 00:08:08.754 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:08.754 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:08.754 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:08.754 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.754 #2 INITED exec/s: 0 rss: 59Mb 00:08:08.754 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:08.754 This may also happen if the target rejected all inputs we tried so far 00:08:08.754 [2024-11-29 05:33:19.904541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.754 [2024-11-29 05:33:19.904570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.754 [2024-11-29 05:33:19.904634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.754 [2024-11-29 05:33:19.904650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.014 NEW_FUNC[1/670]: 0x4638f8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:09.014 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:09.014 #19 NEW cov: 11582 ft: 11573 corp: 2/21b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:09.014 [2024-11-29 05:33:20.205234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.014 [2024-11-29 05:33:20.205274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.014 [2024-11-29 05:33:20.205329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89208989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.014 [2024-11-29 05:33:20.205347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.014 #20 NEW cov: 11695 ft: 11883 corp: 3/42b lim: 40 exec/s: 0 rss: 68Mb L: 21/21 MS: 1 InsertByte- 00:08:09.014 [2024-11-29 05:33:20.255569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.014 [2024-11-29 05:33:20.255601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.014 [2024-11-29 05:33:20.255660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:898989ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.014 [2024-11-29 05:33:20.255674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.014 [2024-11-29 05:33:20.255732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.014 [2024-11-29 05:33:20.255745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.014 [2024-11-29 05:33:20.255803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.014 [2024-11-29 05:33:20.255817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.014 #21 NEW cov: 11701 ft: 12633 corp: 4/80b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:09.014 [2024-11-29 05:33:20.295504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.014 [2024-11-29 05:33:20.295529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.014 [2024-11-29 05:33:20.295588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.014 [2024-11-29 05:33:20.295607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.014 [2024-11-29 05:33:20.295643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.014 [2024-11-29 05:33:20.295657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.274 #24 NEW cov: 11786 ft: 13161 corp: 5/107b lim: 40 exec/s: 0 rss: 68Mb L: 27/38 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:09.274 [2024-11-29 05:33:20.335724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.274 [2024-11-29 05:33:20.335750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.274 [2024-11-29 05:33:20.335805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.274 [2024-11-29 05:33:20.335819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.274 [2024-11-29 05:33:20.335875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.274 [2024-11-29 05:33:20.335888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.274 [2024-11-29 05:33:20.335946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffff89 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.274 [2024-11-29 05:33:20.335960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.274 #30 NEW cov: 11786 ft: 13268 corp: 6/139b lim: 40 exec/s: 0 rss: 68Mb L: 32/38 MS: 1 EraseBytes- 00:08:09.274 [2024-11-29 05:33:20.375617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.274 [2024-11-29 05:33:20.375643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.274 [2024-11-29 05:33:20.375698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89208989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.274 [2024-11-29 05:33:20.375713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.274 #31 NEW cov: 11786 ft: 13412 corp: 7/160b lim: 40 exec/s: 0 rss: 68Mb L: 21/38 MS: 1 ShuffleBytes- 00:08:09.274 [2024-11-29 05:33:20.415604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.274 [2024-11-29 05:33:20.415630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.274 #35 NEW cov: 11786 ft: 13818 corp: 8/175b lim: 40 exec/s: 0 rss: 68Mb L: 15/38 MS: 4 ChangeBit-ShuffleBytes-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:09.274 [2024-11-29 05:33:20.455722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.274 [2024-11-29 05:33:20.455748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.274 #36 NEW cov: 11786 ft: 13975 corp: 9/189b lim: 40 exec/s: 0 rss: 68Mb L: 14/38 MS: 1 EraseBytes- 00:08:09.275 [2024-11-29 05:33:20.496061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.275 [2024-11-29 05:33:20.496086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.275 [2024-11-29 05:33:20.496142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 
cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.275 [2024-11-29 05:33:20.496156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.275 [2024-11-29 05:33:20.496212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.275 [2024-11-29 05:33:20.496225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.275 #37 NEW cov: 11786 ft: 14026 corp: 10/216b lim: 40 exec/s: 0 rss: 68Mb L: 27/38 MS: 1 ShuffleBytes- 00:08:09.275 [2024-11-29 05:33:20.536355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.275 [2024-11-29 05:33:20.536381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.275 [2024-11-29 05:33:20.536436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.275 [2024-11-29 05:33:20.536450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.275 [2024-11-29 05:33:20.536508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.275 [2024-11-29 05:33:20.536522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.275 [2024-11-29 05:33:20.536575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.275 [2024-11-29 05:33:20.536589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.275 #38 NEW cov: 11786 ft: 14051 corp: 11/254b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 CopyPart- 00:08:09.535 [2024-11-29 05:33:20.576605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:40404040 cdw11:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.535 [2024-11-29 05:33:20.576631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.535 [2024-11-29 05:33:20.576688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:40404040 cdw11:4025fdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.535 [2024-11-29 05:33:20.576701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.535 [2024-11-29 05:33:20.576757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.535 [2024-11-29 05:33:20.576771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.535 [2024-11-29 05:33:20.576803] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.535 [2024-11-29 05:33:20.576816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.535 [2024-11-29 05:33:20.576867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.535 [2024-11-29 05:33:20.576880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.535 #39 NEW cov: 11786 ft: 14138 corp: 12/294b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:09.535 [2024-11-29 05:33:20.616455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.616480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.536 [2024-11-29 05:33:20.616538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.616551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.536 [2024-11-29 05:33:20.616610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.616624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.536 #40 NEW cov: 11786 ft: 14153 corp: 13/321b lim: 40 exec/s: 0 rss: 68Mb L: 27/40 MS: 1 ShuffleBytes- 00:08:09.536 [2024-11-29 05:33:20.656449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.656475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.536 [2024-11-29 05:33:20.656533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89248989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.656547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.536 #41 NEW cov: 11786 ft: 14215 corp: 14/342b lim: 40 exec/s: 0 rss: 68Mb L: 21/40 MS: 1 ChangeBit- 00:08:09.536 [2024-11-29 05:33:20.696557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.696583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.536 [2024-11-29 05:33:20.696647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89898889 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 
05:33:20.696661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.536 #42 NEW cov: 11786 ft: 14263 corp: 15/362b lim: 40 exec/s: 0 rss: 68Mb L: 20/40 MS: 1 ChangeBit- 00:08:09.536 [2024-11-29 05:33:20.736805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.736831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.536 [2024-11-29 05:33:20.736887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.736901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.536 [2024-11-29 05:33:20.736958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.736971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.536 #43 NEW cov: 11786 ft: 14308 corp: 16/389b lim: 40 exec/s: 0 rss: 68Mb L: 27/40 MS: 1 CopyPart- 00:08:09.536 [2024-11-29 05:33:20.776903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.776929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.536 [2024-11-29 05:33:20.777001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8989ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.777015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.536 [2024-11-29 05:33:20.777072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff89 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.777085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.536 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.536 #44 NEW cov: 11809 ft: 14337 corp: 17/417b lim: 40 exec/s: 0 rss: 68Mb L: 28/40 MS: 1 EraseBytes- 00:08:09.536 [2024-11-29 05:33:20.816884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.816910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.536 [2024-11-29 05:33:20.816969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89248989 cdw11:892d8989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.536 [2024-11-29 05:33:20.816982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.796 #45 NEW cov: 11809 ft: 14364 corp: 18/439b lim: 40 exec/s: 0 rss: 68Mb L: 22/40 MS: 1 InsertByte- 00:08:09.796 [2024-11-29 05:33:20.856981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.857008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.796 [2024-11-29 05:33:20.857064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89890889 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.857078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.796 #46 NEW cov: 11809 ft: 14371 corp: 19/459b lim: 40 exec/s: 0 rss: 69Mb L: 20/40 MS: 1 ChangeBit- 00:08:09.796 [2024-11-29 05:33:20.897103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:129d9d9d cdw11:9d9d9d9d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.897128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.796 [2024-11-29 05:33:20.897182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:9d9d9d9d cdw11:9d9d9d9d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.897196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.796 #49 NEW cov: 11809 ft: 14402 corp: 20/479b lim: 40 exec/s: 49 rss: 69Mb L: 20/40 MS: 3 ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:09.796 [2024-11-29 05:33:20.927311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.927336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.796 [2024-11-29 05:33:20.927392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.927406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.796 [2024-11-29 05:33:20.927462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.927475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.796 #50 NEW cov: 11809 ft: 14420 corp: 21/505b lim: 40 exec/s: 50 rss: 69Mb L: 26/40 MS: 1 EraseBytes- 00:08:09.796 [2024-11-29 05:33:20.967410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25fdfdfd cdw11:fdfd3dfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.967436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:09.796 [2024-11-29 05:33:20.967510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.967524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.796 [2024-11-29 05:33:20.967580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:20.967601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.796 #51 NEW cov: 11809 ft: 14435 corp: 22/532b lim: 40 exec/s: 51 rss: 69Mb L: 27/40 MS: 1 ChangeByte- 00:08:09.796 [2024-11-29 05:33:21.007787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:40404040 cdw11:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:21.007813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.796 [2024-11-29 05:33:21.007869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:40404040 cdw11:40258920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:21.007882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.796 [2024-11-29 05:33:21.007937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:21.007967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.796 [2024-11-29 05:33:21.008021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:89fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.796 [2024-11-29 05:33:21.008035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.796 [2024-11-29 05:33:21.008089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.797 [2024-11-29 05:33:21.008103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.797 #52 NEW cov: 11809 ft: 14470 corp: 23/572b lim: 40 exec/s: 52 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:08:09.797 [2024-11-29 05:33:21.047704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.797 [2024-11-29 05:33:21.047729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.797 [2024-11-29 05:33:21.047787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.797 [2024-11-29 05:33:21.047800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.797 [2024-11-29 05:33:21.047856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.797 [2024-11-29 05:33:21.047869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.797 #53 NEW cov: 11809 ft: 14513 corp: 24/600b lim: 40 exec/s: 53 rss: 69Mb L: 28/40 MS: 1 InsertByte- 00:08:09.797 [2024-11-29 05:33:21.087689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:129d9d9d cdw11:9d9d9d9d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.797 [2024-11-29 05:33:21.087714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.797 [2024-11-29 05:33:21.087772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:e79d9d9d cdw11:9d9d9d9d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.797 [2024-11-29 05:33:21.087785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.057 #54 NEW cov: 11809 ft: 14593 corp: 25/621b lim: 40 exec/s: 54 rss: 69Mb L: 21/40 MS: 1 InsertByte- 00:08:10.057 [2024-11-29 05:33:21.128038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.128063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.057 [2024-11-29 05:33:21.128135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.128149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.057 [2024-11-29 05:33:21.128205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffff95ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.128218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.057 [2024-11-29 05:33:21.128273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.128286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.057 #55 NEW cov: 11809 ft: 14619 corp: 26/654b lim: 40 exec/s: 55 rss: 69Mb L: 33/40 MS: 1 InsertByte- 00:08:10.057 [2024-11-29 05:33:21.168002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.168027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.057 [2024-11-29 05:33:21.168084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 
nsid:0 cdw10:89898889 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.168099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.057 [2024-11-29 05:33:21.168153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:89898889 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.168167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.057 #61 NEW cov: 11809 ft: 14656 corp: 27/685b lim: 40 exec/s: 61 rss: 69Mb L: 31/40 MS: 1 CopyPart- 00:08:10.057 [2024-11-29 05:33:21.207972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:fb898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.207997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.057 [2024-11-29 05:33:21.208068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89898988 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.208082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.057 #62 NEW cov: 11809 ft: 14669 corp: 28/706b lim: 40 exec/s: 62 rss: 69Mb L: 21/40 MS: 1 InsertByte- 00:08:10.057 [2024-11-29 05:33:21.238070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.057 [2024-11-29 05:33:21.238094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.058 [2024-11-29 05:33:21.238150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89208989 cdw11:8c898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.058 [2024-11-29 05:33:21.238167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.058 #63 NEW cov: 11809 ft: 14672 corp: 29/727b lim: 40 exec/s: 63 rss: 69Mb L: 21/40 MS: 1 ChangeBinInt- 00:08:10.058 [2024-11-29 05:33:21.268227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.058 [2024-11-29 05:33:21.268252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.058 [2024-11-29 05:33:21.268321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89248989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.058 [2024-11-29 05:33:21.268335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.058 #64 NEW cov: 11809 ft: 14683 corp: 30/748b lim: 40 exec/s: 64 rss: 69Mb L: 21/40 MS: 1 CopyPart- 00:08:10.058 [2024-11-29 05:33:21.308277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.058 
[2024-11-29 05:33:21.308302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.058 [2024-11-29 05:33:21.308358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.058 [2024-11-29 05:33:21.308371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.058 #65 NEW cov: 11809 ft: 14748 corp: 31/769b lim: 40 exec/s: 65 rss: 69Mb L: 21/40 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:10.058 [2024-11-29 05:33:21.348417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89408989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.058 [2024-11-29 05:33:21.348442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.058 [2024-11-29 05:33:21.348500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89248989 cdw11:892d8989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.058 [2024-11-29 05:33:21.348513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.318 #66 NEW cov: 11809 ft: 14783 corp: 32/791b lim: 40 exec/s: 66 rss: 69Mb L: 22/40 MS: 1 ChangeByte- 00:08:10.318 [2024-11-29 05:33:21.388773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.388797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.388852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89200909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.388865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.388917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.388930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.388983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:09090989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.388996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.318 #67 NEW cov: 11809 ft: 14785 corp: 33/829b lim: 40 exec/s: 67 rss: 69Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:08:10.318 [2024-11-29 05:33:21.428883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.428908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 
05:33:21.428975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:26200909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.428989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.429039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.429053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.429101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:09090989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.429115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.318 #68 NEW cov: 11809 ft: 14788 corp: 34/867b lim: 40 exec/s: 68 rss: 69Mb L: 38/40 MS: 1 ChangeBinInt- 00:08:10.318 [2024-11-29 05:33:21.469161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:40404040 cdw11:40404040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.469186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.469240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:40404040 cdw11:ea258920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.469253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.469305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.469318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.469355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:89fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.469368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.469420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.469433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.318 #69 NEW cov: 11809 ft: 14795 corp: 35/907b lim: 40 exec/s: 69 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:08:10.318 [2024-11-29 05:33:21.508861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.508885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.508936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:24898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.508949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.318 #70 NEW cov: 11809 ft: 14809 corp: 36/928b lim: 40 exec/s: 70 rss: 69Mb L: 21/40 MS: 1 ShuffleBytes- 00:08:10.318 [2024-11-29 05:33:21.538961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898901 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.538986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.318 [2024-11-29 05:33:21.539056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000089 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.318 [2024-11-29 05:33:21.539070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.319 #71 NEW cov: 11809 ft: 14846 corp: 37/949b lim: 40 exec/s: 71 rss: 69Mb L: 21/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:10.319 [2024-11-29 05:33:21.579194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.319 [2024-11-29 05:33:21.579220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.319 [2024-11-29 05:33:21.579275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8920b6b6 cdw11:b6b6b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.319 [2024-11-29 05:33:21.579289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.319 [2024-11-29 05:33:21.579343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b6b6b6b6 cdw11:b6b6b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.319 [2024-11-29 05:33:21.579356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.319 #72 NEW cov: 11809 ft: 14853 corp: 38/980b lim: 40 exec/s: 72 rss: 70Mb L: 31/40 MS: 1 InsertRepeatedBytes- 00:08:10.319 [2024-11-29 05:33:21.619603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.319 [2024-11-29 05:33:21.619629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.319 [2024-11-29 05:33:21.619686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89898889 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.319 [2024-11-29 05:33:21.619700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.319 [2024-11-29 05:33:21.619753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:89898889 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.319 [2024-11-29 05:33:21.619767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 05:33:21.619822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:4e4e4e4e cdw11:4e4e4e4e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.619836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 05:33:21.619892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:4e898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.619905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.579 #73 NEW cov: 11809 ft: 14862 corp: 39/1020b lim: 40 exec/s: 73 rss: 70Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:10.579 [2024-11-29 05:33:21.659438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.659466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 05:33:21.659519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8920b6b6 cdw11:b6b6b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.659532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 05:33:21.659586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b6b6b6b6 cdw11:b6b6b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.659603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.579 #74 NEW cov: 11809 ft: 14867 corp: 40/1051b lim: 40 exec/s: 74 rss: 70Mb L: 31/40 MS: 1 CrossOver- 00:08:10.579 [2024-11-29 05:33:21.699696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.699721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 05:33:21.699798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.699812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 05:33:21.699865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898901 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.699878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 
05:33:21.699926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000089 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.699939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.579 #75 NEW cov: 11809 ft: 14868 corp: 41/1084b lim: 40 exec/s: 75 rss: 70Mb L: 33/40 MS: 1 CopyPart- 00:08:10.579 [2024-11-29 05:33:21.739399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.739424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.579 #76 NEW cov: 11809 ft: 14893 corp: 42/1099b lim: 40 exec/s: 76 rss: 70Mb L: 15/40 MS: 1 CopyPart- 00:08:10.579 [2024-11-29 05:33:21.779920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.779947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 05:33:21.780003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:898989ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.780017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 05:33:21.780067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.780080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.579 [2024-11-29 05:33:21.780136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:95ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.579 [2024-11-29 05:33:21.780149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.579 #77 NEW cov: 11809 ft: 14901 corp: 43/1138b lim: 40 exec/s: 77 rss: 70Mb L: 39/40 MS: 1 CopyPart- 00:08:10.579 [2024-11-29 05:33:21.819911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.580 [2024-11-29 05:33:21.819936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.580 [2024-11-29 05:33:21.819993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8920b6b6 cdw11:bcb6b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.580 [2024-11-29 05:33:21.820006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.580 [2024-11-29 05:33:21.820062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b6b6b6b6 cdw11:b6b6b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.580 
[2024-11-29 05:33:21.820091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.580 #78 NEW cov: 11809 ft: 14905 corp: 44/1169b lim: 40 exec/s: 78 rss: 70Mb L: 31/40 MS: 1 ChangeBinInt- 00:08:10.580 [2024-11-29 05:33:21.860021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25fdfddd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.580 [2024-11-29 05:33:21.860046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.580 [2024-11-29 05:33:21.860101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.580 [2024-11-29 05:33:21.860115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.580 [2024-11-29 05:33:21.860169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.580 [2024-11-29 05:33:21.860182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.580 #79 NEW cov: 11809 ft: 14912 corp: 45/1196b lim: 40 exec/s: 79 rss: 70Mb L: 27/40 MS: 1 ChangeBit- 00:08:10.840 [2024-11-29 05:33:21.900207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.840 [2024-11-29 05:33:21.900233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.840 [2024-11-29 05:33:21.900287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:89898989 cdw11:89898989 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.840 [2024-11-29 05:33:21.900300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.840 [2024-11-29 05:33:21.900350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.840 [2024-11-29 05:33:21.900364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.840 [2024-11-29 05:33:21.900413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.840 [2024-11-29 05:33:21.900426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.840 #80 NEW cov: 11809 ft: 14919 corp: 46/1234b lim: 40 exec/s: 40 rss: 70Mb L: 38/40 MS: 1 CopyPart- 00:08:10.840 #80 DONE cov: 11809 ft: 14919 corp: 46/1234b lim: 40 exec/s: 40 rss: 70Mb 00:08:10.840 ###### Recommended dictionary. ###### 00:08:10.840 "\001\000\000\000" # Uses: 0 00:08:10.840 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:10.840 ###### End of recommended dictionary. 
###### 00:08:10.840 Done 80 runs in 2 second(s) 00:08:10.840 05:33:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:10.840 05:33:22 -- ../common.sh@72 -- # (( i++ )) 00:08:10.840 05:33:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.840 05:33:22 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:10.840 05:33:22 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:10.840 05:33:22 -- nvmf/run.sh@24 -- # local timen=1 00:08:10.840 05:33:22 -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.840 05:33:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:10.840 05:33:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:10.840 05:33:22 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:10.840 05:33:22 -- nvmf/run.sh@29 -- # port=4414 00:08:10.840 05:33:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:10.840 05:33:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:10.840 05:33:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.840 05:33:22 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:10.840 [2024-11-29 05:33:22.077781] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:10.840 [2024-11-29 05:33:22.077866] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2219809 ] 00:08:10.840 EAL: No free 2048 kB hugepages reported on node 1 00:08:11.099 [2024-11-29 05:33:22.331091] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.099 [2024-11-29 05:33:22.357994] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:11.099 [2024-11-29 05:33:22.358131] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.359 [2024-11-29 05:33:22.409365] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.359 [2024-11-29 05:33:22.425725] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:11.359 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.359 INFO: Seed: 50449168 00:08:11.359 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:11.359 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:11.359 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:11.359 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.359 #2 INITED exec/s: 0 rss: 59Mb 00:08:11.359 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:11.359 This may also happen if the target rejected all inputs we tried so far 00:08:11.359 [2024-11-29 05:33:22.503614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.359 [2024-11-29 05:33:22.503656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.359 [2024-11-29 05:33:22.503754] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.359 [2024-11-29 05:33:22.503771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.359 [2024-11-29 05:33:22.503866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.359 [2024-11-29 05:33:22.503884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.649 NEW_FUNC[1/671]: 0x4654c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:11.649 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.649 #16 NEW cov: 11576 ft: 11577 corp: 2/25b lim: 35 exec/s: 0 rss: 67Mb L: 24/24 MS: 4 ShuffleBytes-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:11.649 [2024-11-29 05:33:22.833447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.649 [2024-11-29 05:33:22.833485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.649 [2024-11-29 05:33:22.833621] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.649 [2024-11-29 05:33:22.833653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.649 [2024-11-29 05:33:22.833776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.649 [2024-11-29 05:33:22.833795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.649 #20 NEW cov: 11689 ft: 12197 corp: 3/48b lim: 35 exec/s: 0 rss: 67Mb L: 23/24 MS: 4 ChangeByte-ChangeASCIIInt-ChangeBit-CrossOver- 00:08:11.649 [2024-11-29 05:33:22.873232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.649 [2024-11-29 05:33:22.873260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.649 [2024-11-29 05:33:22.873390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.649 [2024-11-29 05:33:22.873406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.649 #21 NEW cov: 11695 ft: 12723 corp: 4/62b lim: 35 exec/s: 0 rss: 67Mb 
L: 14/24 MS: 1 EraseBytes- 00:08:11.649 [2024-11-29 05:33:22.923633] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.649 [2024-11-29 05:33:22.923663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.649 [2024-11-29 05:33:22.923809] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.649 [2024-11-29 05:33:22.923827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.649 [2024-11-29 05:33:22.923955] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.649 [2024-11-29 05:33:22.923975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.649 #22 NEW cov: 11780 ft: 13026 corp: 5/85b lim: 35 exec/s: 0 rss: 67Mb L: 23/24 MS: 1 ChangeByte- 00:08:11.909 [2024-11-29 05:33:22.963703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:22.963731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:22.963874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:22.963895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:22.964022] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:22.964040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.909 #28 NEW cov: 11780 ft: 13155 corp: 6/109b lim: 35 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 CopyPart- 00:08:11.909 [2024-11-29 05:33:23.003800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.003828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.003963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.003980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.004109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.004126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.909 #29 NEW cov: 11780 ft: 13247 corp: 7/134b lim: 35 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertByte- 00:08:11.909 [2024-11-29 05:33:23.043981] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.044009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.044136] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.044153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.044279] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.044297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.909 #30 NEW cov: 11780 ft: 13294 corp: 8/158b lim: 35 exec/s: 0 rss: 67Mb L: 24/25 MS: 1 InsertByte- 00:08:11.909 [2024-11-29 05:33:23.083774] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.083803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.083939] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000031 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.083957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.909 #31 NEW cov: 11780 ft: 13354 corp: 9/172b lim: 35 exec/s: 0 rss: 67Mb L: 14/25 MS: 1 ChangeASCIIInt- 00:08:11.909 [2024-11-29 05:33:23.134549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.134576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.134701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.134716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.134845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.134862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.134987] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.135006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.909 #32 NEW cov: 11780 ft: 13660 corp: 10/205b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 CMP- DE: "\322'\010\002\000\000\000\000"- 00:08:11.909 [2024-11-29 05:33:23.184618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:11.909 [2024-11-29 05:33:23.184647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.184786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.184804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.184942] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.909 [2024-11-29 05:33:23.184960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.909 [2024-11-29 05:33:23.185085] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.910 [2024-11-29 05:33:23.185102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.910 #33 NEW cov: 11780 ft: 13691 corp: 11/238b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 CopyPart- 00:08:12.169 [2024-11-29 05:33:23.224292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.169 [2024-11-29 05:33:23.224320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.169 [2024-11-29 05:33:23.224468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.169 [2024-11-29 05:33:23.224485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.169 #34 NEW cov: 11780 ft: 13737 corp: 12/257b lim: 35 exec/s: 0 rss: 67Mb L: 19/33 MS: 1 EraseBytes- 00:08:12.169 [2024-11-29 05:33:23.274984] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.169 [2024-11-29 05:33:23.275012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.169 [2024-11-29 05:33:23.275147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.169 [2024-11-29 05:33:23.275167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.169 [2024-11-29 05:33:23.275300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.169 [2024-11-29 05:33:23.275318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.169 [2024-11-29 05:33:23.275416] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.169 [2024-11-29 05:33:23.275434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.169 #35 NEW cov: 11780 ft: 13752 corp: 13/289b 
lim: 35 exec/s: 0 rss: 67Mb L: 32/33 MS: 1 InsertRepeatedBytes- 00:08:12.170 [2024-11-29 05:33:23.325110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.325137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.325266] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.325284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.325415] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.325432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.325562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.325579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.170 #36 NEW cov: 11780 ft: 13855 corp: 14/322b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 PersAutoDict- DE: "\322'\010\002\000\000\000\000"- 00:08:12.170 [2024-11-29 05:33:23.375037] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.375065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.375198] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.375215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.375349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.375364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.170 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:12.170 #37 NEW cov: 11803 ft: 13914 corp: 15/347b lim: 35 exec/s: 0 rss: 68Mb L: 25/33 MS: 1 ChangeByte- 00:08:12.170 [2024-11-29 05:33:23.414905] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.414933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.415062] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.415080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:08:12.170 [2024-11-29 05:33:23.415209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.415228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.415353] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.415372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.170 #38 NEW cov: 11803 ft: 13956 corp: 16/379b lim: 35 exec/s: 0 rss: 68Mb L: 32/33 MS: 1 ChangeBit- 00:08:12.170 [2024-11-29 05:33:23.465512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.465538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.465670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.465687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.465823] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.465839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.170 [2024-11-29 05:33:23.465968] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.170 [2024-11-29 05:33:23.465985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.430 #39 NEW cov: 11803 ft: 13985 corp: 17/412b lim: 35 exec/s: 39 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:12.430 [2024-11-29 05:33:23.505674] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.505702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.505830] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.505847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.505980] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.505997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.506131] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 
05:33:23.506148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.430 #40 NEW cov: 11803 ft: 14023 corp: 18/444b lim: 35 exec/s: 40 rss: 68Mb L: 32/33 MS: 1 ChangeBit- 00:08:12.430 [2024-11-29 05:33:23.544462] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.544491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.430 #41 NEW cov: 11803 ft: 14761 corp: 19/456b lim: 35 exec/s: 41 rss: 68Mb L: 12/33 MS: 1 EraseBytes- 00:08:12.430 [2024-11-29 05:33:23.585261] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.585289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.585417] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000031 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.585434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.430 #42 NEW cov: 11803 ft: 14794 corp: 20/470b lim: 35 exec/s: 42 rss: 68Mb L: 14/33 MS: 1 ChangeByte- 00:08:12.430 [2024-11-29 05:33:23.625755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.625785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.625923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.625939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.626077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.626094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.430 #43 NEW cov: 11803 ft: 14856 corp: 21/493b lim: 35 exec/s: 43 rss: 68Mb L: 23/33 MS: 1 ShuffleBytes- 00:08:12.430 [2024-11-29 05:33:23.666020] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.666047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.666180] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.666196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.666318] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 
05:33:23.666336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.666461] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.666478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.430 #44 NEW cov: 11803 ft: 14888 corp: 22/526b lim: 35 exec/s: 44 rss: 68Mb L: 33/33 MS: 1 CMP- DE: "\000\223\257\217B^\235\206"- 00:08:12.430 [2024-11-29 05:33:23.706247] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000a7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.706279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.430 [2024-11-29 05:33:23.706405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.430 [2024-11-29 05:33:23.706422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.431 [2024-11-29 05:33:23.706544] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.431 [2024-11-29 05:33:23.706561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.431 [2024-11-29 05:33:23.706699] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.431 [2024-11-29 05:33:23.706715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.431 #45 NEW cov: 11810 ft: 14974 corp: 23/554b lim: 35 exec/s: 45 rss: 68Mb L: 28/33 MS: 1 InsertRepeatedBytes- 00:08:12.690 [2024-11-29 05:33:23.746363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.690 [2024-11-29 05:33:23.746391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.690 [2024-11-29 05:33:23.746515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.690 [2024-11-29 05:33:23.746532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.690 [2024-11-29 05:33:23.746676] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000093 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.690 [2024-11-29 05:33:23.746703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.690 [2024-11-29 05:33:23.746832] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.690 [2024-11-29 05:33:23.746849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.690 #46 NEW cov: 11810 
ft: 14986 corp: 24/585b lim: 35 exec/s: 46 rss: 68Mb L: 31/33 MS: 1 PersAutoDict- DE: "\000\223\257\217B^\235\206"- 00:08:12.690 [2024-11-29 05:33:23.786014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.690 [2024-11-29 05:33:23.786042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.691 [2024-11-29 05:33:23.786162] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.786179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.691 #47 NEW cov: 11810 ft: 15006 corp: 25/599b lim: 35 exec/s: 47 rss: 68Mb L: 14/33 MS: 1 CopyPart- 00:08:12.691 [2024-11-29 05:33:23.826323] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.826352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.691 [2024-11-29 05:33:23.826486] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000af SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.826507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.691 [2024-11-29 05:33:23.826637] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.826653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.691 #48 NEW cov: 11810 ft: 15026 corp: 26/622b lim: 35 exec/s: 48 rss: 68Mb L: 23/33 MS: 1 EraseBytes- 00:08:12.691 [2024-11-29 05:33:23.865699] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.865729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.691 [2024-11-29 05:33:23.865866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000031 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.865882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.691 #49 NEW cov: 11810 ft: 15037 corp: 27/636b lim: 35 exec/s: 49 rss: 68Mb L: 14/33 MS: 1 ChangeBit- 00:08:12.691 [2024-11-29 05:33:23.906054] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.906080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.691 #50 NEW cov: 11810 ft: 15047 corp: 28/645b lim: 35 exec/s: 50 rss: 68Mb L: 9/33 MS: 1 EraseBytes- 00:08:12.691 [2024-11-29 05:33:23.956715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.956745] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.691 [2024-11-29 05:33:23.956881] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.956899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.691 [2024-11-29 05:33:23.957024] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.691 [2024-11-29 05:33:23.957041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.691 #51 NEW cov: 11810 ft: 15109 corp: 29/667b lim: 35 exec/s: 51 rss: 68Mb L: 22/33 MS: 1 InsertRepeatedBytes- 00:08:12.949 [2024-11-29 05:33:24.006659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.949 [2024-11-29 05:33:24.006689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.949 [2024-11-29 05:33:24.006821] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.949 [2024-11-29 05:33:24.006842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.949 #52 NEW cov: 11810 ft: 15120 corp: 30/681b lim: 35 exec/s: 52 rss: 68Mb L: 14/33 MS: 1 ShuffleBytes- 00:08:12.949 [2024-11-29 05:33:24.046762] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.949 [2024-11-29 05:33:24.046790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.949 [2024-11-29 05:33:24.046914] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.949 [2024-11-29 05:33:24.046931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.949 #53 NEW cov: 11810 ft: 15134 corp: 31/698b lim: 35 exec/s: 53 rss: 68Mb L: 17/33 MS: 1 EraseBytes- 00:08:12.949 [2024-11-29 05:33:24.087395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.949 [2024-11-29 05:33:24.087422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.949 [2024-11-29 05:33:24.087550] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.949 [2024-11-29 05:33:24.087567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.949 [2024-11-29 05:33:24.087695] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.949 [2024-11-29 05:33:24.087711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:08:12.949 [2024-11-29 05:33:24.087829] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.949 [2024-11-29 05:33:24.087846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.949 #54 NEW cov: 11810 ft: 15176 corp: 32/732b lim: 35 exec/s: 54 rss: 69Mb L: 34/34 MS: 1 CrossOver- 00:08:12.950 [2024-11-29 05:33:24.126817] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.950 [2024-11-29 05:33:24.126846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.950 #55 NEW cov: 11810 ft: 15177 corp: 33/741b lim: 35 exec/s: 55 rss: 69Mb L: 9/34 MS: 1 ChangeByte- 00:08:12.950 [2024-11-29 05:33:24.167383] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.950 [2024-11-29 05:33:24.167411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.950 [2024-11-29 05:33:24.167542] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.950 [2024-11-29 05:33:24.167561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.950 [2024-11-29 05:33:24.167687] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.950 [2024-11-29 05:33:24.167704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.950 [2024-11-29 05:33:24.207478] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.950 [2024-11-29 05:33:24.207506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.950 [2024-11-29 05:33:24.207638] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.950 [2024-11-29 05:33:24.207656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.950 [2024-11-29 05:33:24.207779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.950 [2024-11-29 05:33:24.207796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.950 #57 NEW cov: 11810 ft: 15217 corp: 34/765b lim: 35 exec/s: 57 rss: 69Mb L: 24/34 MS: 2 InsertByte-ChangeBinInt- 00:08:13.209 [2024-11-29 05:33:24.257692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.257721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.209 [2024-11-29 05:33:24.257847] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.257864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.209 [2024-11-29 05:33:24.257983] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.258000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.209 #58 NEW cov: 11810 ft: 15226 corp: 35/788b lim: 35 exec/s: 58 rss: 69Mb L: 23/34 MS: 1 ShuffleBytes- 00:08:13.209 [2024-11-29 05:33:24.297533] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.297561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.209 NEW_FUNC[1/2]: 0x47fd88 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:13.209 NEW_FUNC[2/2]: 0x1145778 in nvmf_ctrlr_set_features_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1489 00:08:13.209 #60 NEW cov: 11867 ft: 15299 corp: 36/806b lim: 35 exec/s: 60 rss: 69Mb L: 18/34 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:13.209 [2024-11-29 05:33:24.347911] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.347942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.209 [2024-11-29 05:33:24.348075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.348092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.209 [2024-11-29 05:33:24.348215] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.348231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.209 #61 NEW cov: 11867 ft: 15308 corp: 37/830b lim: 35 exec/s: 61 rss: 69Mb L: 24/34 MS: 1 EraseBytes- 00:08:13.209 [2024-11-29 05:33:24.387745] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.387773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.209 [2024-11-29 05:33:24.387908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.387924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.209 #62 NEW cov: 11867 ft: 15313 corp: 38/850b lim: 35 exec/s: 62 rss: 69Mb L: 20/34 MS: 1 InsertByte- 00:08:13.209 [2024-11-29 05:33:24.427919] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.427947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.209 [2024-11-29 05:33:24.428072] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000031 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.428101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.209 #63 NEW cov: 11867 ft: 15322 corp: 39/864b lim: 35 exec/s: 63 rss: 69Mb L: 14/34 MS: 1 ChangeBit- 00:08:13.209 [2024-11-29 05:33:24.478109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.478136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.209 [2024-11-29 05:33:24.478271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.478290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.209 [2024-11-29 05:33:24.478425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000d2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.209 [2024-11-29 05:33:24.478443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.210 [2024-11-29 05:33:24.478568] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.210 [2024-11-29 05:33:24.478585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.210 #64 pulse cov: 11867 ft: 15331 corp: 39/864b lim: 35 exec/s: 32 rss: 69Mb 00:08:13.210 #64 NEW cov: 11867 ft: 15331 corp: 40/898b lim: 35 exec/s: 32 rss: 69Mb L: 34/34 MS: 1 InsertByte- 00:08:13.210 #64 DONE cov: 11867 ft: 15331 corp: 40/898b lim: 35 exec/s: 32 rss: 69Mb 00:08:13.210 ###### Recommended dictionary. ###### 00:08:13.210 "\322'\010\002\000\000\000\000" # Uses: 1 00:08:13.210 "\000\223\257\217B^\235\206" # Uses: 1 00:08:13.210 ###### End of recommended dictionary. 
###### 00:08:13.210 Done 64 runs in 2 second(s) 00:08:13.469 05:33:24 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:13.469 05:33:24 -- ../common.sh@72 -- # (( i++ )) 00:08:13.469 05:33:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.469 05:33:24 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:13.469 05:33:24 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:13.469 05:33:24 -- nvmf/run.sh@24 -- # local timen=1 00:08:13.469 05:33:24 -- nvmf/run.sh@25 -- # local core=0x1 00:08:13.469 05:33:24 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:13.469 05:33:24 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:13.469 05:33:24 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:13.469 05:33:24 -- nvmf/run.sh@29 -- # port=4415 00:08:13.469 05:33:24 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:13.469 05:33:24 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:13.469 05:33:24 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:13.469 05:33:24 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:13.469 [2024-11-29 05:33:24.661531] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:13.469 [2024-11-29 05:33:24.661632] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2220346 ] 00:08:13.469 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.728 [2024-11-29 05:33:24.837235] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.728 [2024-11-29 05:33:24.857085] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:13.728 [2024-11-29 05:33:24.857203] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.728 [2024-11-29 05:33:24.908397] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.728 [2024-11-29 05:33:24.924730] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:13.728 INFO: Running with entropic power schedule (0xFF, 100). 00:08:13.728 INFO: Seed: 2548463250 00:08:13.728 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:13.728 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:13.728 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:13.728 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.728 #2 INITED exec/s: 0 rss: 59Mb 00:08:13.728 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:13.728 This may also happen if the target rejected all inputs we tried so far 00:08:13.728 [2024-11-29 05:33:24.970149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.728 [2024-11-29 05:33:24.970178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.728 [2024-11-29 05:33:24.970234] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.728 [2024-11-29 05:33:24.970248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.012 NEW_FUNC[1/671]: 0x466a08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:14.012 NEW_FUNC[2/671]: 0x4868f8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:14.012 #4 NEW cov: 11577 ft: 11579 corp: 2/26b lim: 35 exec/s: 0 rss: 67Mb L: 25/25 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:14.012 [2024-11-29 05:33:25.280951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000043a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.012 [2024-11-29 05:33:25.280982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.012 [2024-11-29 05:33:25.281059] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.012 [2024-11-29 05:33:25.281073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.012 [2024-11-29 05:33:25.281135] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.012 [2024-11-29 05:33:25.281149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.012 #9 NEW cov: 11691 ft: 12132 corp: 3/48b lim: 35 exec/s: 0 rss: 67Mb L: 22/25 MS: 5 InsertByte-EraseBytes-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:14.272 [2024-11-29 05:33:25.320934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000043a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.320961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.321037] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.321051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.321110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.321124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.272 #10 NEW cov: 11697 ft: 12336 corp: 4/70b lim: 35 
exec/s: 0 rss: 67Mb L: 22/25 MS: 1 ChangeBinInt- 00:08:14.272 [2024-11-29 05:33:25.361143] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.361167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.361241] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.361254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.272 #11 NEW cov: 11782 ft: 12605 corp: 5/95b lim: 35 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 ChangeBit- 00:08:14.272 [2024-11-29 05:33:25.401228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000063a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.401254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.401300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.401313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.401373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.401386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.272 #12 NEW cov: 11782 ft: 12750 corp: 6/117b lim: 35 exec/s: 0 rss: 67Mb L: 22/25 MS: 1 CMP- DE: "\325'\010\002\000\000\000\000"- 00:08:14.272 [2024-11-29 05:33:25.441324] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.441349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.441408] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.441422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.441478] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.441491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.272 #13 NEW cov: 11782 ft: 12777 corp: 7/139b lim: 35 exec/s: 0 rss: 67Mb L: 22/25 MS: 1 CMP- DE: "\377\377\377\000"- 00:08:14.272 [2024-11-29 05:33:25.481603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.481628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:14.272 [2024-11-29 05:33:25.481701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.481715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.481783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.481796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.272 #14 NEW cov: 11782 ft: 13177 corp: 8/172b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:14.272 [2024-11-29 05:33:25.521440] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.521465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.272 #15 NEW cov: 11782 ft: 13518 corp: 9/186b lim: 35 exec/s: 0 rss: 67Mb L: 14/33 MS: 1 InsertRepeatedBytes- 00:08:14.272 [2024-11-29 05:33:25.561683] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000043a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.561707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.561792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.561805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.272 [2024-11-29 05:33:25.561880] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.272 [2024-11-29 05:33:25.561893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.530 #16 NEW cov: 11782 ft: 13531 corp: 10/208b lim: 35 exec/s: 0 rss: 67Mb L: 22/33 MS: 1 ChangeBit- 00:08:14.530 [2024-11-29 05:33:25.601757] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.530 [2024-11-29 05:33:25.601782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.530 [2024-11-29 05:33:25.601858] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.530 [2024-11-29 05:33:25.601875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.530 [2024-11-29 05:33:25.601934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.530 [2024-11-29 05:33:25.601948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.530 #17 NEW cov: 11782 ft: 13607 corp: 11/230b lim: 35 exec/s: 0 rss: 67Mb 
L: 22/33 MS: 1 ShuffleBytes- 00:08:14.530 [2024-11-29 05:33:25.641948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000043a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.530 [2024-11-29 05:33:25.641973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.530 [2024-11-29 05:33:25.642049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.530 [2024-11-29 05:33:25.642063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.530 [2024-11-29 05:33:25.642120] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.530 [2024-11-29 05:33:25.642134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.530 #18 NEW cov: 11782 ft: 13632 corp: 12/252b lim: 35 exec/s: 0 rss: 67Mb L: 22/33 MS: 1 ChangeByte- 00:08:14.530 [2024-11-29 05:33:25.682145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.530 [2024-11-29 05:33:25.682170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.530 [2024-11-29 05:33:25.682228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.530 [2024-11-29 05:33:25.682242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.530 [2024-11-29 05:33:25.682301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000126 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.530 [2024-11-29 05:33:25.682314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.531 [2024-11-29 05:33:25.682373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000126 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.682386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.531 #19 NEW cov: 11782 ft: 13806 corp: 13/286b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:14.531 [2024-11-29 05:33:25.722138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000043a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.722163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.531 [2024-11-29 05:33:25.722219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.722233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.531 [2024-11-29 05:33:25.722290] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.722304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.531 #20 NEW cov: 11782 ft: 13814 corp: 14/312b lim: 35 exec/s: 0 rss: 67Mb L: 26/34 MS: 1 PersAutoDict- DE: "\377\377\377\000"- 00:08:14.531 [2024-11-29 05:33:25.762221] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000043a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.762246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.531 [2024-11-29 05:33:25.762305] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.762318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.531 [2024-11-29 05:33:25.762377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.762390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.531 #21 NEW cov: 11782 ft: 13882 corp: 15/334b lim: 35 exec/s: 0 rss: 68Mb L: 22/34 MS: 1 ChangeBinInt- 00:08:14.531 [2024-11-29 05:33:25.802471] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.802496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.531 [2024-11-29 05:33:25.802566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.802579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.531 [2024-11-29 05:33:25.802643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.802656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.531 [2024-11-29 05:33:25.802715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.531 [2024-11-29 05:33:25.802728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.531 #22 NEW cov: 11782 ft: 13898 corp: 16/368b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 CrossOver- 00:08:14.789 [2024-11-29 05:33:25.842480] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000063a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.789 [2024-11-29 05:33:25.842504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.789 [2024-11-29 05:33:25.842565] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.789 [2024-11-29 05:33:25.842578] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.789 [2024-11-29 05:33:25.842636] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.789 [2024-11-29 05:33:25.842650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.789 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.789 #23 NEW cov: 11805 ft: 13922 corp: 17/391b lim: 35 exec/s: 0 rss: 68Mb L: 23/34 MS: 1 InsertByte- 00:08:14.789 [2024-11-29 05:33:25.882583] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.789 [2024-11-29 05:33:25.882612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.789 [2024-11-29 05:33:25.882675] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.789 [2024-11-29 05:33:25.882689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.789 NEW_FUNC[1/1]: 0x481108 in feat_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:282 00:08:14.789 #24 NEW cov: 11828 ft: 13999 corp: 18/413b lim: 35 exec/s: 0 rss: 68Mb L: 22/34 MS: 1 PersAutoDict- DE: "\325'\010\002\000\000\000\000"- 00:08:14.789 [2024-11-29 05:33:25.923087] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000731 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.789 [2024-11-29 05:33:25.923112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.789 [2024-11-29 05:33:25.923174] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.789 [2024-11-29 05:33:25.923188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.790 [2024-11-29 05:33:25.923248] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.790 [2024-11-29 05:33:25.923261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.790 #25 NEW cov: 11828 ft: 14087 corp: 19/447b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 InsertByte- 00:08:14.790 [2024-11-29 05:33:25.962889] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.790 [2024-11-29 05:33:25.962914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.790 [2024-11-29 05:33:25.962990] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.790 [2024-11-29 05:33:25.963004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:14.790 NEW_FUNC[1/1]: 0x47fd88 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:14.790 #26 NEW cov: 11866 ft: 14129 corp: 20/469b lim: 35 exec/s: 26 rss: 68Mb L: 22/34 MS: 1 ShuffleBytes- 00:08:14.790 [2024-11-29 05:33:26.002820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000001d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.790 [2024-11-29 05:33:26.002846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.790 [2024-11-29 05:33:26.002916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.790 [2024-11-29 05:33:26.002930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.790 #27 NEW cov: 11866 ft: 14204 corp: 21/483b lim: 35 exec/s: 27 rss: 68Mb L: 14/34 MS: 1 PersAutoDict- DE: "\325'\010\002\000\000\000\000"- 00:08:14.790 [2024-11-29 05:33:26.043216] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.790 [2024-11-29 05:33:26.043240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.790 [2024-11-29 05:33:26.043372] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.790 [2024-11-29 05:33:26.043386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.790 [2024-11-29 05:33:26.043443] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000126 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.790 [2024-11-29 05:33:26.043459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.790 #28 NEW cov: 11866 ft: 14238 corp: 22/517b lim: 35 exec/s: 28 rss: 68Mb L: 34/34 MS: 1 CopyPart- 00:08:14.790 [2024-11-29 05:33:26.083195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.790 [2024-11-29 05:33:26.083220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.048 NEW_FUNC[1/1]: 0x4852b8 in feat_interrupt_coalescing /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:325 00:08:15.048 #29 NEW cov: 11888 ft: 14335 corp: 23/539b lim: 35 exec/s: 29 rss: 68Mb L: 22/34 MS: 1 ChangeBinInt- 00:08:15.048 [2024-11-29 05:33:26.123364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.048 [2024-11-29 05:33:26.123388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.048 [2024-11-29 05:33:26.123443] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.048 [2024-11-29 05:33:26.123456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.048 #30 NEW cov: 11888 ft: 14349 corp: 24/565b lim: 35 exec/s: 30 rss: 68Mb L: 26/34 MS: 1 InsertByte- 00:08:15.048 [2024-11-29 05:33:26.163280] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000043a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.048 [2024-11-29 05:33:26.163305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.048 [2024-11-29 05:33:26.163360] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.048 [2024-11-29 05:33:26.163373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.048 #31 NEW cov: 11888 ft: 14433 corp: 25/584b lim: 35 exec/s: 31 rss: 68Mb L: 19/34 MS: 1 EraseBytes- 00:08:15.048 [2024-11-29 05:33:26.203492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000043a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.048 [2024-11-29 05:33:26.203517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.048 [2024-11-29 05:33:26.203591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.048 [2024-11-29 05:33:26.203616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.048 [2024-11-29 05:33:26.203674] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.048 [2024-11-29 05:33:26.203688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.049 #32 NEW cov: 11888 ft: 14476 corp: 26/606b lim: 35 exec/s: 32 rss: 68Mb L: 22/34 MS: 1 ChangeBit- 00:08:15.049 [2024-11-29 05:33:26.243686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.049 [2024-11-29 05:33:26.243711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.049 [2024-11-29 05:33:26.243767] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.049 [2024-11-29 05:33:26.243781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.049 #33 NEW cov: 11888 ft: 14533 corp: 27/632b lim: 35 exec/s: 33 rss: 68Mb L: 26/34 MS: 1 InsertByte- 00:08:15.049 [2024-11-29 05:33:26.283824] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.049 [2024-11-29 05:33:26.283849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.049 [2024-11-29 05:33:26.283903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.049 [2024-11-29 05:33:26.283916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.049 #34 NEW cov: 11888 ft: 14609 corp: 28/654b lim: 35 exec/s: 34 rss: 69Mb L: 22/34 MS: 1 CopyPart- 00:08:15.049 [2024-11-29 05:33:26.324209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.049 [2024-11-29 05:33:26.324233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.049 [2024-11-29 05:33:26.324291] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.049 [2024-11-29 05:33:26.324305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.049 [2024-11-29 05:33:26.324359] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES TIMESTAMP cid:6 cdw10:0000010e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.049 [2024-11-29 05:33:26.324372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.049 [2024-11-29 05:33:26.324432] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000126 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.049 [2024-11-29 05:33:26.324445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.049 [2024-11-29 05:33:26.324505] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.049 [2024-11-29 05:33:26.324518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.049 #35 NEW cov: 11888 ft: 14670 corp: 29/689b lim: 35 exec/s: 35 rss: 69Mb L: 35/35 MS: 1 InsertByte- 00:08:15.308 [2024-11-29 05:33:26.364076] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.364101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.364159] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.364173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.308 #36 NEW cov: 11888 ft: 14689 corp: 30/715b lim: 35 exec/s: 36 rss: 69Mb L: 26/35 MS: 1 InsertByte- 00:08:15.308 [2024-11-29 05:33:26.404184] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.404210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.404265] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.404278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.308 #37 NEW cov: 11888 ft: 
14690 corp: 31/740b lim: 35 exec/s: 37 rss: 69Mb L: 25/35 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:15.308 [2024-11-29 05:33:26.444217] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.444247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.444319] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.444333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.308 #38 NEW cov: 11888 ft: 14700 corp: 32/762b lim: 35 exec/s: 38 rss: 69Mb L: 22/35 MS: 1 ChangeBinInt- 00:08:15.308 [2024-11-29 05:33:26.484573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.484602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.484673] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.484687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.484743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000245 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.484767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.308 #39 NEW cov: 11888 ft: 14706 corp: 33/795b lim: 35 exec/s: 39 rss: 69Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:08:15.308 [2024-11-29 05:33:26.524438] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000063a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.524462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.524519] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.524533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.524591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.524608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.308 #40 NEW cov: 11888 ft: 14733 corp: 34/817b lim: 35 exec/s: 40 rss: 69Mb L: 22/35 MS: 1 ShuffleBytes- 00:08:15.308 [2024-11-29 05:33:26.564717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.564741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.564814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.564827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.564886] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000093 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.564900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.308 #41 NEW cov: 11888 ft: 14744 corp: 35/847b lim: 35 exec/s: 41 rss: 69Mb L: 30/35 MS: 1 CMP- DE: "\034\037\375\327\225\257\223\000"- 00:08:15.308 [2024-11-29 05:33:26.604906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.604934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.604994] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.605008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.308 [2024-11-29 05:33:26.605066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.308 [2024-11-29 05:33:26.605079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.566 #42 NEW cov: 11888 ft: 14759 corp: 36/880b lim: 35 exec/s: 42 rss: 69Mb L: 33/35 MS: 1 CMP- DE: "\377\377\377\377\001\010'\325"- 00:08:15.567 [2024-11-29 05:33:26.644823] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000013a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.644849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.644903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000126 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.644917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.644972] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.644985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.567 #43 NEW cov: 11888 ft: 14771 corp: 37/903b lim: 35 exec/s: 43 rss: 69Mb L: 23/35 MS: 1 EraseBytes- 00:08:15.567 [2024-11-29 05:33:26.684980] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000071a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.685005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.567 
[2024-11-29 05:33:26.685068] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.685082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.567 #44 NEW cov: 11888 ft: 14785 corp: 38/929b lim: 35 exec/s: 44 rss: 69Mb L: 26/35 MS: 1 ChangeBinInt- 00:08:15.567 [2024-11-29 05:33:26.725193] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.725217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.725271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000001d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.725285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.725341] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.725354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.567 NEW_FUNC[1/1]: 0x1140898 in nvmf_ctrlr_get_features_host_behavior_support /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1823 00:08:15.567 #45 NEW cov: 11910 ft: 14823 corp: 39/959b lim: 35 exec/s: 45 rss: 69Mb L: 30/35 MS: 1 PersAutoDict- DE: "\377\377\377\377\001\010'\325"- 00:08:15.567 [2024-11-29 05:33:26.775338] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.775366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.775440] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.775453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.775512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.775525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.567 #46 NEW cov: 11910 ft: 14898 corp: 40/993b lim: 35 exec/s: 46 rss: 69Mb L: 34/35 MS: 1 InsertByte- 00:08:15.567 [2024-11-29 05:33:26.815495] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.815519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.815579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.815592] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.815655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000245 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.815668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.567 #47 NEW cov: 11910 ft: 14904 corp: 41/1026b lim: 35 exec/s: 47 rss: 69Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:15.567 [2024-11-29 05:33:26.855662] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.855688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.855750] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.855764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.567 [2024-11-29 05:33:26.855823] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.567 [2024-11-29 05:33:26.855836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.827 #48 NEW cov: 11910 ft: 14937 corp: 42/1054b lim: 35 exec/s: 48 rss: 69Mb L: 28/35 MS: 1 InsertRepeatedBytes- 00:08:15.827 [2024-11-29 05:33:26.895327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.827 [2024-11-29 05:33:26.895352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.827 #49 NEW cov: 11910 ft: 15138 corp: 43/1066b lim: 35 exec/s: 49 rss: 69Mb L: 12/35 MS: 1 EraseBytes- 00:08:15.827 [2024-11-29 05:33:26.935390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.827 [2024-11-29 05:33:26.935414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.827 #53 NEW cov: 11910 ft: 15142 corp: 44/1073b lim: 35 exec/s: 26 rss: 69Mb L: 7/35 MS: 4 CrossOver-InsertByte-InsertByte-InsertByte- 00:08:15.827 #53 DONE cov: 11910 ft: 15142 corp: 44/1073b lim: 35 exec/s: 26 rss: 69Mb 00:08:15.827 ###### Recommended dictionary. ###### 00:08:15.827 "\325'\010\002\000\000\000\000" # Uses: 2 00:08:15.827 "\377\377\377\000" # Uses: 1 00:08:15.827 "\001\000\000\000" # Uses: 0 00:08:15.827 "\034\037\375\327\225\257\223\000" # Uses: 0 00:08:15.827 "\377\377\377\377\001\010'\325" # Uses: 1 00:08:15.827 ###### End of recommended dictionary. 
###### 00:08:15.827 Done 53 runs in 2 second(s) 00:08:15.827 05:33:27 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:08:15.827 05:33:27 -- ../common.sh@72 -- # (( i++ )) 00:08:15.827 05:33:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.827 05:33:27 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:15.827 05:33:27 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:15.827 05:33:27 -- nvmf/run.sh@24 -- # local timen=1 00:08:15.827 05:33:27 -- nvmf/run.sh@25 -- # local core=0x1 00:08:15.827 05:33:27 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:15.827 05:33:27 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:15.827 05:33:27 -- nvmf/run.sh@29 -- # printf %02d 16 00:08:15.827 05:33:27 -- nvmf/run.sh@29 -- # port=4416 00:08:15.827 05:33:27 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:15.827 05:33:27 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:15.827 05:33:27 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.827 05:33:27 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:08:15.827 [2024-11-29 05:33:27.120481] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:15.827 [2024-11-29 05:33:27.120552] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2220759 ] 00:08:16.086 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.086 [2024-11-29 05:33:27.303623] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.086 [2024-11-29 05:33:27.324425] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:16.086 [2024-11-29 05:33:27.324548] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.086 [2024-11-29 05:33:27.375818] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:16.344 [2024-11-29 05:33:27.392161] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:16.344 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.344 INFO: Seed: 721516703 00:08:16.344 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:16.344 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:16.344 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:16.344 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.344 #2 INITED exec/s: 0 rss: 59Mb 00:08:16.344 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:16.344 This may also happen if the target rejected all inputs we tried so far 00:08:16.344 [2024-11-29 05:33:27.437497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.344 [2024-11-29 05:33:27.437527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.344 [2024-11-29 05:33:27.437565] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.344 [2024-11-29 05:33:27.437581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.344 [2024-11-29 05:33:27.437635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.344 [2024-11-29 05:33:27.437655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.344 [2024-11-29 05:33:27.437705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.344 [2024-11-29 05:33:27.437719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.604 NEW_FUNC[1/671]: 0x467ec8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:16.604 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.604 #6 NEW cov: 11667 ft: 11668 corp: 2/85b lim: 105 exec/s: 0 rss: 67Mb L: 84/84 MS: 4 CMP-ChangeBit-EraseBytes-InsertRepeatedBytes- DE: "\001\000\000@"- 00:08:16.604 [2024-11-29 05:33:27.748264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.748297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.604 [2024-11-29 05:33:27.748332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.748347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.604 [2024-11-29 05:33:27.748400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.748415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.604 [2024-11-29 05:33:27.748468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.748482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:16.604 #7 NEW cov: 11780 ft: 12149 corp: 3/169b lim: 105 exec/s: 0 rss: 67Mb L: 84/84 MS: 1 ShuffleBytes- 00:08:16.604 [2024-11-29 05:33:27.798212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582438655 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.798240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.604 [2024-11-29 05:33:27.798280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.798294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.604 [2024-11-29 05:33:27.798346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551370 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.798361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.604 #9 NEW cov: 11786 ft: 12766 corp: 4/242b lim: 105 exec/s: 0 rss: 67Mb L: 73/84 MS: 2 CopyPart-CrossOver- 00:08:16.604 [2024-11-29 05:33:27.838126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070253641727 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.838153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.604 #13 NEW cov: 11871 ft: 13525 corp: 5/263b lim: 105 exec/s: 0 rss: 67Mb L: 21/84 MS: 4 InsertByte-ChangeBinInt-ChangeByte-CrossOver- 00:08:16.604 [2024-11-29 05:33:27.878569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.878601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.604 [2024-11-29 05:33:27.878651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.878667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.604 [2024-11-29 05:33:27.878718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.878732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.604 [2024-11-29 05:33:27.878784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.604 [2024-11-29 05:33:27.878799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.864 #14 NEW cov: 11871 ft: 13738 corp: 6/347b lim: 105 exec/s: 0 rss: 67Mb L: 84/84 MS: 1 CMP- DE: "\001\000\000\000\000\000\004\000"- 
00:08:16.864 [2024-11-29 05:33:27.918696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.918722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:27.918787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.918803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:27.918856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.918871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:27.918923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.918938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.864 #19 NEW cov: 11871 ft: 13850 corp: 7/447b lim: 105 exec/s: 0 rss: 67Mb L: 100/100 MS: 5 InsertByte-ChangeBinInt-InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:16.864 [2024-11-29 05:33:27.958845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.958872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:27.958936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.958952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:27.959002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.959017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:27.959076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.959091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.864 #20 NEW cov: 11871 ft: 13915 corp: 8/538b lim: 105 exec/s: 0 rss: 67Mb L: 91/100 MS: 1 CrossOver- 00:08:16.864 [2024-11-29 05:33:27.998895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.998922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:27.998970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.998986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:27.999037] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.999067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:27.999121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:27.999136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.864 #21 NEW cov: 11871 ft: 13978 corp: 9/622b lim: 105 exec/s: 0 rss: 67Mb L: 84/100 MS: 1 ShuffleBytes- 00:08:16.864 [2024-11-29 05:33:28.038898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582357504 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.038926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:28.038965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.038980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:28.039034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.039051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.864 #28 NEW cov: 11871 ft: 14036 corp: 10/685b lim: 105 exec/s: 0 rss: 67Mb L: 63/100 MS: 2 CrossOver-CrossOver- 00:08:16.864 [2024-11-29 05:33:28.079139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.079168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:28.079206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.079219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:28.079271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.079287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:16.864 [2024-11-29 05:33:28.079342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.079357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.864 #29 NEW cov: 11871 ft: 14143 corp: 11/789b lim: 105 exec/s: 0 rss: 67Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:08:16.864 [2024-11-29 05:33:28.119330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.119357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:28.119399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.119415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:28.119469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.119485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:28.119535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.119549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.864 #30 NEW cov: 11871 ft: 14179 corp: 12/880b lim: 105 exec/s: 0 rss: 67Mb L: 91/104 MS: 1 ShuffleBytes- 00:08:16.864 [2024-11-29 05:33:28.159302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582438655 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.159328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:28.159384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.159400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.864 [2024-11-29 05:33:28.159454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551370 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.864 [2024-11-29 05:33:28.159468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.124 #31 NEW cov: 11871 ft: 14227 corp: 13/953b lim: 105 exec/s: 0 rss: 68Mb L: 73/104 MS: 1 PersAutoDict- DE: "\001\000\000@"- 00:08:17.124 [2024-11-29 05:33:28.199515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:17.124 [2024-11-29 05:33:28.199541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.124 [2024-11-29 05:33:28.199584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.124 [2024-11-29 05:33:28.199596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.199657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.199676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.199731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.199746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.125 #32 NEW cov: 11871 ft: 14283 corp: 14/1037b lim: 105 exec/s: 0 rss: 68Mb L: 84/104 MS: 1 ShuffleBytes- 00:08:17.125 [2024-11-29 05:33:28.239606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.239633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.239688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.239703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.239756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.239770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.239821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.239836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.125 #33 NEW cov: 11871 ft: 14334 corp: 15/1121b lim: 105 exec/s: 0 rss: 68Mb L: 84/104 MS: 1 CopyPart- 00:08:17.125 [2024-11-29 05:33:28.269590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.269619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.269657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 
len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.269672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.269724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.269738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.125 #34 NEW cov: 11871 ft: 14395 corp: 16/1191b lim: 105 exec/s: 0 rss: 68Mb L: 70/104 MS: 1 EraseBytes- 00:08:17.125 [2024-11-29 05:33:28.309824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.309852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.309904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.309920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.309971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.309990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.310043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.310058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.125 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:17.125 #35 NEW cov: 11894 ft: 14405 corp: 17/1275b lim: 105 exec/s: 0 rss: 68Mb L: 84/104 MS: 1 ChangeByte- 00:08:17.125 [2024-11-29 05:33:28.349799] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.349827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.349871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1099511627520 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.349886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.349938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.349953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:17.125 #36 NEW cov: 11894 ft: 14479 corp: 18/1349b lim: 105 exec/s: 0 rss: 68Mb L: 74/104 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:17.125 [2024-11-29 05:33:28.390047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.390075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.390113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.390128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.390183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.390199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.125 [2024-11-29 05:33:28.390251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.125 [2024-11-29 05:33:28.390266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.125 #37 NEW cov: 11894 ft: 14493 corp: 19/1433b lim: 105 exec/s: 0 rss: 68Mb L: 84/104 MS: 1 ChangeByte- 00:08:17.385 [2024-11-29 05:33:28.430194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070492585983 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.430222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.430264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.430279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.430336] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.430352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.430406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.430420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.385 #38 NEW cov: 11894 ft: 14503 corp: 20/1517b lim: 105 exec/s: 38 rss: 68Mb L: 84/104 MS: 1 ChangeByte- 00:08:17.385 [2024-11-29 05:33:28.470284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.470312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.470366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.470382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.470434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709289471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.470449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.470500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.470515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.385 #39 NEW cov: 11894 ft: 14513 corp: 21/1601b lim: 105 exec/s: 39 rss: 68Mb L: 84/104 MS: 1 ChangeBinInt- 00:08:17.385 [2024-11-29 05:33:28.510402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.510429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.510476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.510492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.510544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.510559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.510611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.510627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.385 #40 NEW cov: 11894 ft: 14542 corp: 22/1685b lim: 105 exec/s: 40 rss: 68Mb L: 84/104 MS: 1 ChangeByte- 00:08:17.385 [2024-11-29 05:33:28.550482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.550510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.550564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 
cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.550580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.550630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.550646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.550702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.550718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.385 #41 NEW cov: 11894 ft: 14561 corp: 23/1776b lim: 105 exec/s: 41 rss: 68Mb L: 91/104 MS: 1 ShuffleBytes- 00:08:17.385 [2024-11-29 05:33:28.590632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.590660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.590732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16777218 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.590747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.590798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.590812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.590865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.590880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.385 #42 NEW cov: 11894 ft: 14590 corp: 24/1876b lim: 105 exec/s: 42 rss: 68Mb L: 100/104 MS: 1 ChangeBinInt- 00:08:17.385 [2024-11-29 05:33:28.630626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582357504 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.630654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.630698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.630714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.630768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 
lba:18446744069431361535 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.630783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.385 #43 NEW cov: 11894 ft: 14644 corp: 25/1943b lim: 105 exec/s: 43 rss: 68Mb L: 67/104 MS: 1 PersAutoDict- DE: "\001\000\000@"- 00:08:17.385 [2024-11-29 05:33:28.670767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743116787417855 len:65344 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.670796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.670835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.670851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.385 [2024-11-29 05:33:28.670902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.385 [2024-11-29 05:33:28.670917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.645 #48 NEW cov: 11894 ft: 14664 corp: 26/2017b lim: 105 exec/s: 48 rss: 68Mb L: 74/104 MS: 5 EraseBytes-CrossOver-ChangeByte-CopyPart-CrossOver- 00:08:17.645 [2024-11-29 05:33:28.721126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.721154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.721206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.721221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.721272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.721288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.721339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.721354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.721406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.721420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:17.645 #49 NEW cov: 11894 ft: 14749 corp: 27/2122b lim: 105 exec/s: 49 rss: 68Mb L: 105/105 MS: 1 InsertByte- 
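In the coverage lines above, MS: records the mutation sequence applied to the input and DE: the dictionary entry it injected (for PersAutoDict/CMP mutations), printed as a C octal-escaped byte string — so "\001\000\000@" is the four bytes 0x01 0x00 0x00 0x40, since '@' is 0x40. A small standalone sketch that decodes one such entry, assuming nothing beyond standard C:

#include <stdio.h>

int main(void)
{
    /* One DE: entry from this run, as a C octal-escaped literal. */
    const unsigned char de[] = "\001\000\000@";

    /* sizeof(de) includes the trailing NUL, so stop one short. */
    for (size_t i = 0; i < sizeof(de) - 1; i++)
        printf("%02x ", de[i]);
    putchar('\n');   /* prints: 01 00 00 40 */
    return 0;
}

Entries like these are what the fuzzer accumulates into the "Recommended dictionary" block printed at the end of the run, along with how often each was used.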
00:08:17.645 [2024-11-29 05:33:28.761143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.761171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.761236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.761252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.761306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.761321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.761375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.761390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.645 #50 NEW cov: 11894 ft: 14755 corp: 28/2206b lim: 105 exec/s: 50 rss: 68Mb L: 84/105 MS: 1 ShuffleBytes- 00:08:17.645 [2024-11-29 05:33:28.801223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.801250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.801298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.801313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.801367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.801398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.801452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.801466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.645 #51 NEW cov: 11894 ft: 14760 corp: 29/2301b lim: 105 exec/s: 51 rss: 69Mb L: 95/105 MS: 1 EraseBytes- 00:08:17.645 [2024-11-29 05:33:28.841344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.841372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 
05:33:28.841420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.841435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.841487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.841504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.841555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.841571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.645 #52 NEW cov: 11894 ft: 14761 corp: 30/2393b lim: 105 exec/s: 52 rss: 69Mb L: 92/105 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\004\000"- 00:08:17.645 [2024-11-29 05:33:28.881451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.881478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.881529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.881545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.645 [2024-11-29 05:33:28.881594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.645 [2024-11-29 05:33:28.881614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.646 [2024-11-29 05:33:28.881674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.646 [2024-11-29 05:33:28.881689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.646 #53 NEW cov: 11894 ft: 14775 corp: 31/2477b lim: 105 exec/s: 53 rss: 69Mb L: 84/105 MS: 1 PersAutoDict- DE: "\001\000\000@"- 00:08:17.646 [2024-11-29 05:33:28.921607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.646 [2024-11-29 05:33:28.921635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.646 [2024-11-29 05:33:28.921683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.646 [2024-11-29 05:33:28.921699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.646 [2024-11-29 05:33:28.921751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.646 [2024-11-29 05:33:28.921766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.646 [2024-11-29 05:33:28.921819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.646 [2024-11-29 05:33:28.921835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.906 #54 NEW cov: 11894 ft: 14806 corp: 32/2576b lim: 105 exec/s: 54 rss: 69Mb L: 99/105 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:17.906 [2024-11-29 05:33:28.961489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:28.961517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:28.961580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:28.961596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.906 #55 NEW cov: 11894 ft: 15089 corp: 33/2626b lim: 105 exec/s: 55 rss: 69Mb L: 50/105 MS: 1 CrossOver- 00:08:17.906 [2024-11-29 05:33:29.001702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582357504 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.001729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.001774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.001790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.001843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.001858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.906 #56 NEW cov: 11894 ft: 15098 corp: 34/2689b lim: 105 exec/s: 56 rss: 69Mb L: 63/105 MS: 1 ChangeBit- 00:08:17.906 [2024-11-29 05:33:29.041784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743116787417855 len:65344 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.041812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.041852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65283 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 
05:33:29.041868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.041921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.041935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.906 #57 NEW cov: 11894 ft: 15148 corp: 35/2769b lim: 105 exec/s: 57 rss: 69Mb L: 80/105 MS: 1 CopyPart- 00:08:17.906 [2024-11-29 05:33:29.081925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582438655 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.081953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.081995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.082011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.082064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18388478753530445578 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.082094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.906 #58 NEW cov: 11894 ft: 15164 corp: 36/2843b lim: 105 exec/s: 58 rss: 69Mb L: 74/105 MS: 1 InsertByte- 00:08:17.906 [2024-11-29 05:33:29.122049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.122078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.122118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.122134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.122185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.122200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.906 #59 NEW cov: 11894 ft: 15176 corp: 37/2914b lim: 105 exec/s: 59 rss: 69Mb L: 71/105 MS: 1 EraseBytes- 00:08:17.906 [2024-11-29 05:33:29.162304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.162331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.162396] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.162413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.162466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.162482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.906 [2024-11-29 05:33:29.162540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.162555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.906 #60 NEW cov: 11894 ft: 15187 corp: 38/2998b lim: 105 exec/s: 60 rss: 69Mb L: 84/105 MS: 1 ShuffleBytes- 00:08:17.906 [2024-11-29 05:33:29.202077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.906 [2024-11-29 05:33:29.202104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.167 #62 NEW cov: 11894 ft: 15258 corp: 39/3023b lim: 105 exec/s: 62 rss: 69Mb L: 25/105 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:18.167 [2024-11-29 05:33:29.242247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599068480 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.242273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.242309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:47104 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.242324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.167 #66 NEW cov: 11894 ft: 15276 corp: 40/3080b lim: 105 exec/s: 66 rss: 69Mb L: 57/105 MS: 4 EraseBytes-InsertByte-EraseBytes-CrossOver- 00:08:18.167 [2024-11-29 05:33:29.282670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.282698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.282741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.282757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.282811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:18.167 [2024-11-29 05:33:29.282827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.282879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.282893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.167 #67 NEW cov: 11894 ft: 15283 corp: 41/3164b lim: 105 exec/s: 67 rss: 69Mb L: 84/105 MS: 1 ChangeBit- 00:08:18.167 [2024-11-29 05:33:29.322796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.322823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.322870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446603336221196287 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.322886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.322939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.322957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.323009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.323024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.167 #68 NEW cov: 11894 ft: 15329 corp: 42/3248b lim: 105 exec/s: 68 rss: 69Mb L: 84/105 MS: 1 ChangeBit- 00:08:18.167 [2024-11-29 05:33:29.362763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743794704450560 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.362790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.362838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.362854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.362906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744069431361535 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.362921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.167 #69 NEW cov: 11894 ft: 15384 corp: 43/3315b lim: 105 exec/s: 69 rss: 70Mb L: 67/105 MS: 1 ChangeBit- 00:08:18.167 [2024-11-29 
05:33:29.402898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435621375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.402924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.402972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.402988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.403037] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:288511846833455104 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.403053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.443002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069435581695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.443029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.443080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.443094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.167 [2024-11-29 05:33:29.443145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1126995123503104 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.167 [2024-11-29 05:33:29.443161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.167 #71 NEW cov: 11894 ft: 15441 corp: 44/3387b lim: 105 exec/s: 35 rss: 70Mb L: 72/105 MS: 2 EraseBytes-InsertByte- 00:08:18.167 #71 DONE cov: 11894 ft: 15441 corp: 44/3387b lim: 105 exec/s: 35 rss: 70Mb 00:08:18.167 ###### Recommended dictionary. ###### 00:08:18.167 "\001\000\000@" # Uses: 3 00:08:18.167 "\001\000\000\000\000\000\004\000" # Uses: 1 00:08:18.167 "\000\000\000\000" # Uses: 1 00:08:18.167 ###### End of recommended dictionary. 
###### 00:08:18.167 Done 71 runs in 2 second(s) 00:08:18.427 05:33:29 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:18.427 05:33:29 -- ../common.sh@72 -- # (( i++ )) 00:08:18.427 05:33:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.427 05:33:29 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:18.427 05:33:29 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:18.427 05:33:29 -- nvmf/run.sh@24 -- # local timen=1 00:08:18.427 05:33:29 -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.427 05:33:29 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:18.427 05:33:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:18.427 05:33:29 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:18.427 05:33:29 -- nvmf/run.sh@29 -- # port=4417 00:08:18.427 05:33:29 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:18.427 05:33:29 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:18.427 05:33:29 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.427 05:33:29 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:18.427 [2024-11-29 05:33:29.618542] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:18.427 [2024-11-29 05:33:29.618618] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2221177 ] 00:08:18.427 EAL: No free 2048 kB hugepages reported on node 1 00:08:18.687 [2024-11-29 05:33:29.802945] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.687 [2024-11-29 05:33:29.822491] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:18.687 [2024-11-29 05:33:29.822618] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.687 [2024-11-29 05:33:29.873953] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.687 [2024-11-29 05:33:29.890292] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:18.687 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.687 INFO: Seed: 3219492938 00:08:18.687 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:18.687 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:18.687 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:18.687 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.687 #2 INITED exec/s: 0 rss: 59Mb 00:08:18.687 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:18.687 This may also happen if the target rejected all inputs we tried so far 00:08:18.687 [2024-11-29 05:33:29.935695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.687 [2024-11-29 05:33:29.935728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.687 [2024-11-29 05:33:29.935769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.687 [2024-11-29 05:33:29.935785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.687 [2024-11-29 05:33:29.935837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.687 [2024-11-29 05:33:29.935856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.946 NEW_FUNC[1/672]: 0x46b1b8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:18.946 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.946 #36 NEW cov: 11685 ft: 11682 corp: 2/73b lim: 120 exec/s: 0 rss: 67Mb L: 72/72 MS: 4 CopyPart-InsertRepeatedBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:19.205 [2024-11-29 05:33:30.266404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.205 [2024-11-29 05:33:30.266446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.206 [2024-11-29 05:33:30.266501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.266518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.206 [2024-11-29 05:33:30.266569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.266583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.206 #40 NEW cov: 11801 ft: 12171 corp: 3/155b lim: 120 exec/s: 0 rss: 67Mb L: 82/82 MS: 4 ShuffleBytes-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:19.206 [2024-11-29 05:33:30.306545] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.306575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.206 [2024-11-29 05:33:30.306630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.306646] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.206 #41 NEW cov: 11816 ft: 12817 corp: 4/226b lim: 120 exec/s: 0 rss: 67Mb L: 71/82 MS: 1 EraseBytes- 00:08:19.206 [2024-11-29 05:33:30.346542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.346569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.206 [2024-11-29 05:33:30.346611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.346626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.206 [2024-11-29 05:33:30.346679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.346694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.206 #42 NEW cov: 11901 ft: 13073 corp: 5/298b lim: 120 exec/s: 0 rss: 67Mb L: 72/82 MS: 1 ChangeBinInt- 00:08:19.206 [2024-11-29 05:33:30.396456] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.396483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.206 #43 NEW cov: 11901 ft: 14123 corp: 6/343b lim: 120 exec/s: 0 rss: 67Mb L: 45/82 MS: 1 EraseBytes- 00:08:19.206 [2024-11-29 05:33:30.436676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.436704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.206 [2024-11-29 05:33:30.436742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.436758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.206 #44 NEW cov: 11901 ft: 14213 corp: 7/395b lim: 120 exec/s: 0 rss: 67Mb L: 52/82 MS: 1 InsertRepeatedBytes- 00:08:19.206 [2024-11-29 05:33:30.476917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.476945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.206 [2024-11-29 05:33:30.476980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.476995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.206 [2024-11-29 05:33:30.477045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 
lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.206 [2024-11-29 05:33:30.477061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.206 #45 NEW cov: 11901 ft: 14259 corp: 8/477b lim: 120 exec/s: 0 rss: 67Mb L: 82/82 MS: 1 ShuffleBytes- 00:08:19.466 [2024-11-29 05:33:30.516853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.516882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.516924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.516939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.466 #46 NEW cov: 11901 ft: 14340 corp: 9/530b lim: 120 exec/s: 0 rss: 67Mb L: 53/82 MS: 1 CrossOver- 00:08:19.466 [2024-11-29 05:33:30.557134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.557162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.557214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.557228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.557281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407273337 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.557297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.466 #47 NEW cov: 11901 ft: 14355 corp: 10/612b lim: 120 exec/s: 0 rss: 67Mb L: 82/82 MS: 1 ChangeBit- 00:08:19.466 [2024-11-29 05:33:30.597111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.597142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.597196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.597212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.466 #48 NEW cov: 11901 ft: 14387 corp: 11/664b lim: 120 exec/s: 0 rss: 67Mb L: 52/82 MS: 1 CopyPart- 00:08:19.466 [2024-11-29 05:33:30.637389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.637417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.637451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.637466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.637521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.637536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.466 #49 NEW cov: 11901 ft: 14417 corp: 12/746b lim: 120 exec/s: 0 rss: 68Mb L: 82/82 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:19.466 [2024-11-29 05:33:30.677454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.677482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.677534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.677550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.677604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.677620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.466 #50 NEW cov: 11901 ft: 14446 corp: 13/828b lim: 120 exec/s: 0 rss: 68Mb L: 82/82 MS: 1 ChangeByte- 00:08:19.466 [2024-11-29 05:33:30.717593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.717624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.717663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.717679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.717732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:4294901760 len:10 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.717747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.466 #51 NEW cov: 11901 ft: 14471 corp: 14/917b lim: 120 exec/s: 0 rss: 68Mb L: 89/89 MS: 1 CopyPart- 00:08:19.466 [2024-11-29 05:33:30.757570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 
05:33:30.757603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.466 [2024-11-29 05:33:30.757668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.466 [2024-11-29 05:33:30.757684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.725 #52 NEW cov: 11901 ft: 14494 corp: 15/988b lim: 120 exec/s: 0 rss: 69Mb L: 71/89 MS: 1 ChangeBinInt- 00:08:19.725 [2024-11-29 05:33:30.797854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.725 [2024-11-29 05:33:30.797883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.725 [2024-11-29 05:33:30.797924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.725 [2024-11-29 05:33:30.797940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.725 [2024-11-29 05:33:30.797993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.725 [2024-11-29 05:33:30.798008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.725 #53 NEW cov: 11901 ft: 14513 corp: 16/1070b lim: 120 exec/s: 0 rss: 69Mb L: 82/89 MS: 1 ShuffleBytes- 00:08:19.725 [2024-11-29 05:33:30.838120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.725 [2024-11-29 05:33:30.838147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.725 [2024-11-29 05:33:30.838183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.725 [2024-11-29 05:33:30.838198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.725 [2024-11-29 05:33:30.838249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.725 [2024-11-29 05:33:30.838264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:30.838317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8748657313779906937 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.838332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.726 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:19.726 #54 NEW cov: 11924 ft: 14914 corp: 17/1179b lim: 
120 exec/s: 0 rss: 69Mb L: 109/109 MS: 1 CopyPart- 00:08:19.726 [2024-11-29 05:33:30.888093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.888120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:30.888157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.888171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:30.888225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.888240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.726 #55 NEW cov: 11924 ft: 14926 corp: 18/1254b lim: 120 exec/s: 0 rss: 69Mb L: 75/109 MS: 1 EraseBytes- 00:08:19.726 [2024-11-29 05:33:30.918210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.918238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:30.918290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.918306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:30.918356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.918372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.726 #56 NEW cov: 11924 ft: 14994 corp: 19/1336b lim: 120 exec/s: 56 rss: 69Mb L: 82/109 MS: 1 ChangeBit- 00:08:19.726 [2024-11-29 05:33:30.958503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.958530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:30.958572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.958589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:30.958644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.958660] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:30.958710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8748657313779906937 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:30.958726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.726 #57 NEW cov: 11924 ft: 15006 corp: 20/1454b lim: 120 exec/s: 57 rss: 69Mb L: 118/118 MS: 1 InsertRepeatedBytes- 00:08:19.726 [2024-11-29 05:33:31.008579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:31.008610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:31.008661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:31.008676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:31.008726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160911369273465 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:31.008742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.726 [2024-11-29 05:33:31.008797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.726 [2024-11-29 05:33:31.008812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.985 #58 NEW cov: 11924 ft: 15053 corp: 21/1569b lim: 120 exec/s: 58 rss: 69Mb L: 115/118 MS: 1 InsertRepeatedBytes- 00:08:19.985 [2024-11-29 05:33:31.048698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:168458617 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.048725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.048773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.048788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.048839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.048870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.048920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.048935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.985 #59 NEW cov: 11924 ft: 15093 corp: 22/1675b lim: 120 exec/s: 59 rss: 69Mb L: 106/118 MS: 1 CrossOver- 00:08:19.985 [2024-11-29 05:33:31.088807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.088834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.088880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.088895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.088945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.088959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.089009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8748657313779906937 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.089024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.985 #60 NEW cov: 11924 ft: 15107 corp: 23/1793b lim: 120 exec/s: 60 rss: 69Mb L: 118/118 MS: 1 CrossOver- 00:08:19.985 [2024-11-29 05:33:31.128944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.128971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.129017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.129032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.129086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.129101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.129151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8748657313779906937 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.129165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.985 #61 NEW cov: 11924 ft: 15135 corp: 24/1911b lim: 120 exec/s: 61 rss: 69Mb L: 118/118 MS: 1 ChangeByte- 00:08:19.985 [2024-11-29 05:33:31.168905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:19.985 [2024-11-29 05:33:31.168934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.168970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.168985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.169036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2304 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.169051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.985 #62 NEW cov: 11924 ft: 15174 corp: 25/1995b lim: 120 exec/s: 62 rss: 69Mb L: 84/118 MS: 1 CrossOver- 00:08:19.985 [2024-11-29 05:33:31.209162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.209188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.209236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.209251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.209300] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.209314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.209364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8748657313779906873 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.209379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.985 #63 NEW cov: 11924 ft: 15185 corp: 26/2113b lim: 120 exec/s: 63 rss: 69Mb L: 118/118 MS: 1 ChangeBit- 00:08:19.985 [2024-11-29 05:33:31.249023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.249051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.985 [2024-11-29 05:33:31.249086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1095216660480 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.985 [2024-11-29 05:33:31.249101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.985 #64 NEW cov: 11924 ft: 15199 corp: 27/2166b lim: 120 exec/s: 64 rss: 69Mb L: 53/118 MS: 1 InsertByte- 00:08:20.245 [2024-11-29 05:33:31.289153] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.245 [2024-11-29 05:33:31.289181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.245 [2024-11-29 05:33:31.289230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.245 [2024-11-29 05:33:31.289246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.245 #65 NEW cov: 11924 ft: 15200 corp: 28/2219b lim: 120 exec/s: 65 rss: 70Mb L: 53/118 MS: 1 CMP- DE: "\317'\010\002\000\000\000\000"- 00:08:20.245 [2024-11-29 05:33:31.329249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.245 [2024-11-29 05:33:31.329276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.245 [2024-11-29 05:33:31.329317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.245 [2024-11-29 05:33:31.329332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.245 #66 NEW cov: 11924 ft: 15213 corp: 29/2272b lim: 120 exec/s: 66 rss: 70Mb L: 53/118 MS: 1 ShuffleBytes- 00:08:20.245 [2024-11-29 05:33:31.369357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.245 [2024-11-29 05:33:31.369384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.245 [2024-11-29 05:33:31.369434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.245 [2024-11-29 05:33:31.369449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.245 #67 NEW cov: 11924 ft: 15221 corp: 30/2325b lim: 120 exec/s: 67 rss: 70Mb L: 53/118 MS: 1 ShuffleBytes- 00:08:20.245 [2024-11-29 05:33:31.409594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.245 [2024-11-29 05:33:31.409624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.246 [2024-11-29 05:33:31.409666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.409682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.246 [2024-11-29 05:33:31.409730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.409745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:20.246 #68 NEW cov: 11924 ft: 15252 corp: 31/2407b lim: 120 exec/s: 68 rss: 70Mb L: 82/118 MS: 1 ShuffleBytes- 00:08:20.246 [2024-11-29 05:33:31.449602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.449629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.246 [2024-11-29 05:33:31.449666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.449681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.246 #69 NEW cov: 11924 ft: 15278 corp: 32/2460b lim: 120 exec/s: 69 rss: 70Mb L: 53/118 MS: 1 ShuffleBytes- 00:08:20.246 [2024-11-29 05:33:31.489806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.489833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.246 [2024-11-29 05:33:31.489867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.489883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.246 [2024-11-29 05:33:31.489933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.489947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.246 #70 NEW cov: 11924 ft: 15284 corp: 33/2532b lim: 120 exec/s: 70 rss: 70Mb L: 72/118 MS: 1 ChangeBit- 00:08:20.246 [2024-11-29 05:33:31.529949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.529977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.246 [2024-11-29 05:33:31.530014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.530029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.246 [2024-11-29 05:33:31.530082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:9984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.246 [2024-11-29 05:33:31.530097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.506 #71 NEW cov: 11924 ft: 15293 corp: 34/2605b lim: 120 exec/s: 71 rss: 70Mb L: 73/118 MS: 1 InsertByte- 00:08:20.506 [2024-11-29 05:33:31.570173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.570201] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.570245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.570260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.570307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.570321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.570369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8748657313779906937 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.570383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.506 #72 NEW cov: 11924 ft: 15303 corp: 35/2723b lim: 120 exec/s: 72 rss: 70Mb L: 118/118 MS: 1 ChangeBit- 00:08:20.506 [2024-11-29 05:33:31.609990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.610018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.610088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:39582418599936 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.610104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.506 #73 NEW cov: 11924 ft: 15320 corp: 36/2777b lim: 120 exec/s: 73 rss: 70Mb L: 54/118 MS: 1 InsertByte- 00:08:20.506 [2024-11-29 05:33:31.650117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.650144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.650191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:5377 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.650206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.506 #74 NEW cov: 11924 ft: 15401 corp: 37/2831b lim: 120 exec/s: 74 rss: 70Mb L: 54/118 MS: 1 InsertByte- 00:08:20.506 [2024-11-29 05:33:31.690524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.690552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.690588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.690609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.690662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.690676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.690727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8753143321221233017 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.690741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.506 #75 NEW cov: 11924 ft: 15443 corp: 38/2950b lim: 120 exec/s: 75 rss: 70Mb L: 119/119 MS: 1 CopyPart- 00:08:20.506 [2024-11-29 05:33:31.730368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.730395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.730431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:10660389599274177455 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.730446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.506 #76 NEW cov: 11924 ft: 15450 corp: 39/3010b lim: 120 exec/s: 76 rss: 70Mb L: 60/119 MS: 1 CMP- DE: "\001\223\257\223\361O\334\272"- 00:08:20.506 [2024-11-29 05:33:31.770622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731963 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.770650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.770707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.770729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.770792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.770807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.800723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731963 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.800751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.800804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 
lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.800820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.506 [2024-11-29 05:33:31.800871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.506 [2024-11-29 05:33:31.800886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.766 #78 NEW cov: 11924 ft: 15455 corp: 40/3092b lim: 120 exec/s: 78 rss: 70Mb L: 82/119 MS: 2 ChangeBit-CrossOver- 00:08:20.766 [2024-11-29 05:33:31.830794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10660389599274177455 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.766 [2024-11-29 05:33:31.830822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.766 [2024-11-29 05:33:31.830855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.766 [2024-11-29 05:33:31.830871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.766 [2024-11-29 05:33:31.830920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:9984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.766 [2024-11-29 05:33:31.830935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.766 #79 NEW cov: 11924 ft: 15476 corp: 41/3165b lim: 120 exec/s: 79 rss: 70Mb L: 73/119 MS: 1 PersAutoDict- DE: "\001\223\257\223\361O\334\272"- 00:08:20.766 [2024-11-29 05:33:31.870928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:8753160911537731961 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.766 [2024-11-29 05:33:31.870955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.766 [2024-11-29 05:33:31.871010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.766 [2024-11-29 05:33:31.871026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.766 [2024-11-29 05:33:31.871087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.766 [2024-11-29 05:33:31.871118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.766 #80 NEW cov: 11924 ft: 15488 corp: 42/3256b lim: 120 exec/s: 80 rss: 70Mb L: 91/119 MS: 1 EraseBytes- 00:08:20.766 [2024-11-29 05:33:31.910869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.766 [2024-11-29 05:33:31.910901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:20.766 [2024-11-29 05:33:31.910950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.766 [2024-11-29 05:33:31.910967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.766 #81 NEW cov: 11924 ft: 15527 corp: 43/3309b lim: 120 exec/s: 40 rss: 70Mb L: 53/119 MS: 1 InsertByte- 00:08:20.766 #81 DONE cov: 11924 ft: 15527 corp: 43/3309b lim: 120 exec/s: 40 rss: 70Mb 00:08:20.766 ###### Recommended dictionary. ###### 00:08:20.766 "\001\000\000\000" # Uses: 0 00:08:20.766 "\317'\010\002\000\000\000\000" # Uses: 0 00:08:20.766 "\001\223\257\223\361O\334\272" # Uses: 1 00:08:20.766 ###### End of recommended dictionary. ###### 00:08:20.766 Done 81 runs in 2 second(s) 00:08:20.766 05:33:32 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:20.766 05:33:32 -- ../common.sh@72 -- # (( i++ )) 00:08:20.766 05:33:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.766 05:33:32 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:20.766 05:33:32 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:20.766 05:33:32 -- nvmf/run.sh@24 -- # local timen=1 00:08:20.766 05:33:32 -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.766 05:33:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:20.766 05:33:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:20.766 05:33:32 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:20.766 05:33:32 -- nvmf/run.sh@29 -- # port=4418 00:08:20.766 05:33:32 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:20.766 05:33:32 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:20.766 05:33:32 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.766 05:33:32 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:21.026 [2024-11-29 05:33:32.085396] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:21.026 [2024-11-29 05:33:32.085467] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2221715 ] 00:08:21.026 EAL: No free 2048 kB hugepages reported on node 1 00:08:21.026 [2024-11-29 05:33:32.258829] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.026 [2024-11-29 05:33:32.278916] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:21.026 [2024-11-29 05:33:32.279055] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.285 [2024-11-29 05:33:32.330309] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.285 [2024-11-29 05:33:32.346658] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:21.285 INFO: Running with entropic power schedule (0xFF, 100). 00:08:21.285 INFO: Seed: 1379527552 00:08:21.285 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:21.285 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:21.285 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:21.285 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.285 #2 INITED exec/s: 0 rss: 59Mb 00:08:21.285 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:21.285 This may also happen if the target rejected all inputs we tried so far 00:08:21.285 [2024-11-29 05:33:32.391711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.285 [2024-11-29 05:33:32.391744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.544 NEW_FUNC[1/670]: 0x46ea18 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:21.544 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.544 #20 NEW cov: 11632 ft: 11633 corp: 2/30b lim: 100 exec/s: 0 rss: 67Mb L: 29/29 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:21.544 [2024-11-29 05:33:32.692417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.544 [2024-11-29 05:33:32.692448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.544 #26 NEW cov: 11745 ft: 12059 corp: 3/59b lim: 100 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeBit- 00:08:21.544 [2024-11-29 05:33:32.732580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.544 [2024-11-29 05:33:32.732612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.544 [2024-11-29 05:33:32.732649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:21.544 [2024-11-29 05:33:32.732664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.544 #27 NEW cov: 11751 ft: 12735 corp: 4/107b lim: 100 exec/s: 0 
rss: 67Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:21.544 [2024-11-29 05:33:32.772602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.544 [2024-11-29 05:33:32.772628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.544 #28 NEW cov: 11836 ft: 13032 corp: 5/137b lim: 100 exec/s: 0 rss: 67Mb L: 30/48 MS: 1 CrossOver- 00:08:21.544 [2024-11-29 05:33:32.812715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.544 [2024-11-29 05:33:32.812741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.545 #34 NEW cov: 11836 ft: 13158 corp: 6/168b lim: 100 exec/s: 0 rss: 67Mb L: 31/48 MS: 1 InsertByte- 00:08:21.804 [2024-11-29 05:33:32.852843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.805 [2024-11-29 05:33:32.852870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.805 #40 NEW cov: 11836 ft: 13218 corp: 7/193b lim: 100 exec/s: 0 rss: 67Mb L: 25/48 MS: 1 EraseBytes- 00:08:21.805 [2024-11-29 05:33:32.892943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.805 [2024-11-29 05:33:32.892969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.805 #46 NEW cov: 11836 ft: 13377 corp: 8/224b lim: 100 exec/s: 0 rss: 67Mb L: 31/48 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:21.805 [2024-11-29 05:33:32.933082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.805 [2024-11-29 05:33:32.933109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.805 #47 NEW cov: 11836 ft: 13409 corp: 9/255b lim: 100 exec/s: 0 rss: 67Mb L: 31/48 MS: 1 ShuffleBytes- 00:08:21.805 [2024-11-29 05:33:32.973410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.805 [2024-11-29 05:33:32.973437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.805 [2024-11-29 05:33:32.973492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:21.805 [2024-11-29 05:33:32.973510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.805 [2024-11-29 05:33:32.973564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:21.805 [2024-11-29 05:33:32.973578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.805 #58 NEW cov: 11836 ft: 13743 corp: 10/319b lim: 100 exec/s: 0 rss: 67Mb L: 64/64 MS: 1 InsertRepeatedBytes- 00:08:21.805 [2024-11-29 05:33:33.013304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.805 [2024-11-29 05:33:33.013331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.805 #59 NEW cov: 11836 ft: 13772 corp: 11/348b lim: 100 exec/s: 0 rss: 67Mb L: 29/64 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:21.805 [2024-11-29 05:33:33.043431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.805 [2024-11-29 05:33:33.043457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.805 #60 NEW cov: 11836 ft: 13831 corp: 12/379b lim: 100 exec/s: 0 rss: 67Mb L: 31/64 MS: 1 ChangeBinInt- 00:08:21.805 [2024-11-29 05:33:33.073509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:21.805 [2024-11-29 05:33:33.073536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.805 #61 NEW cov: 11836 ft: 13838 corp: 13/411b lim: 100 exec/s: 0 rss: 67Mb L: 32/64 MS: 1 InsertByte- 00:08:22.065 [2024-11-29 05:33:33.113882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.065 [2024-11-29 05:33:33.113909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.065 [2024-11-29 05:33:33.113983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.065 [2024-11-29 05:33:33.113997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.065 [2024-11-29 05:33:33.114053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.065 [2024-11-29 05:33:33.114067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.065 #62 NEW cov: 11836 ft: 13862 corp: 14/485b lim: 100 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 InsertRepeatedBytes- 00:08:22.065 [2024-11-29 05:33:33.153745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.065 [2024-11-29 05:33:33.153772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.065 #63 NEW cov: 11836 ft: 13925 corp: 15/521b lim: 100 exec/s: 0 rss: 67Mb L: 36/74 MS: 1 InsertRepeatedBytes- 00:08:22.065 [2024-11-29 05:33:33.193883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.065 [2024-11-29 05:33:33.193909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.065 #64 NEW cov: 11836 ft: 13962 corp: 16/553b lim: 100 exec/s: 0 rss: 68Mb L: 32/74 MS: 1 InsertByte- 00:08:22.065 [2024-11-29 05:33:33.234008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.065 [2024-11-29 05:33:33.234036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.065 #65 NEW cov: 11836 ft: 13994 corp: 17/586b lim: 100 exec/s: 0 rss: 68Mb L: 33/74 MS: 1 InsertByte- 00:08:22.065 [2024-11-29 05:33:33.274481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.065 [2024-11-29 
05:33:33.274511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.065 [2024-11-29 05:33:33.274563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.065 [2024-11-29 05:33:33.274577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.065 [2024-11-29 05:33:33.274633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.065 [2024-11-29 05:33:33.274649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.065 [2024-11-29 05:33:33.274702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:22.065 [2024-11-29 05:33:33.274716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.065 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:22.065 #66 NEW cov: 11859 ft: 14345 corp: 18/667b lim: 100 exec/s: 0 rss: 68Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:08:22.065 [2024-11-29 05:33:33.314560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.065 [2024-11-29 05:33:33.314586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.065 [2024-11-29 05:33:33.314642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.065 [2024-11-29 05:33:33.314658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.065 [2024-11-29 05:33:33.314709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.065 [2024-11-29 05:33:33.314723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.065 [2024-11-29 05:33:33.314778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:22.065 [2024-11-29 05:33:33.314792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.065 #67 NEW cov: 11859 ft: 14386 corp: 19/764b lim: 100 exec/s: 0 rss: 68Mb L: 97/97 MS: 1 CrossOver- 00:08:22.065 [2024-11-29 05:33:33.354353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.065 [2024-11-29 05:33:33.354380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.325 #68 NEW cov: 11859 ft: 14410 corp: 20/789b lim: 100 exec/s: 68 rss: 68Mb L: 25/97 MS: 1 ChangeByte- 00:08:22.325 [2024-11-29 05:33:33.394863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.325 [2024-11-29 05:33:33.394890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.325 [2024-11-29 05:33:33.394939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.325 [2024-11-29 05:33:33.394953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.325 [2024-11-29 05:33:33.395006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.325 [2024-11-29 05:33:33.395021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.325 [2024-11-29 05:33:33.395074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:22.325 [2024-11-29 05:33:33.395088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.325 #69 NEW cov: 11859 ft: 14465 corp: 21/886b lim: 100 exec/s: 69 rss: 68Mb L: 97/97 MS: 1 CopyPart- 00:08:22.325 [2024-11-29 05:33:33.434596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.325 [2024-11-29 05:33:33.434627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.325 #70 NEW cov: 11859 ft: 14484 corp: 22/915b lim: 100 exec/s: 70 rss: 68Mb L: 29/97 MS: 1 CMP- DE: "\377\001\000\000"- 00:08:22.325 [2024-11-29 05:33:33.474701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.325 [2024-11-29 05:33:33.474728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.325 #71 NEW cov: 11859 ft: 14488 corp: 23/947b lim: 100 exec/s: 71 rss: 68Mb L: 32/97 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:22.325 [2024-11-29 05:33:33.504769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.325 [2024-11-29 05:33:33.504795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.325 #72 NEW cov: 11859 ft: 14503 corp: 24/977b lim: 100 exec/s: 72 rss: 68Mb L: 30/97 MS: 1 InsertByte- 00:08:22.325 [2024-11-29 05:33:33.544984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.325 [2024-11-29 05:33:33.545011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.325 [2024-11-29 05:33:33.545068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.325 [2024-11-29 05:33:33.545084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.325 #73 NEW cov: 11859 ft: 14561 corp: 25/1024b lim: 100 exec/s: 73 rss: 68Mb L: 47/97 MS: 1 CrossOver- 00:08:22.325 [2024-11-29 05:33:33.585039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.325 [2024-11-29 05:33:33.585065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.325 #74 NEW cov: 11859 ft: 14576 corp: 26/1053b lim: 100 exec/s: 74 rss: 68Mb L: 29/97 MS: 1 ChangeByte- 00:08:22.325 [2024-11-29 05:33:33.625148] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.325 [2024-11-29 05:33:33.625174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.585 #75 NEW cov: 11859 ft: 14697 corp: 27/1077b lim: 100 exec/s: 75 rss: 68Mb L: 24/97 MS: 1 EraseBytes- 00:08:22.585 [2024-11-29 05:33:33.665485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.585 [2024-11-29 05:33:33.665512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.585 [2024-11-29 05:33:33.665547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.585 [2024-11-29 05:33:33.665562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.585 [2024-11-29 05:33:33.665623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.585 [2024-11-29 05:33:33.665638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.585 #76 NEW cov: 11859 ft: 14761 corp: 28/1151b lim: 100 exec/s: 76 rss: 68Mb L: 74/97 MS: 1 ChangeBit- 00:08:22.585 [2024-11-29 05:33:33.705368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.585 [2024-11-29 05:33:33.705394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.585 #77 NEW cov: 11859 ft: 14772 corp: 29/1182b lim: 100 exec/s: 77 rss: 68Mb L: 31/97 MS: 1 ChangeBit- 00:08:22.585 [2024-11-29 05:33:33.735592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.585 [2024-11-29 05:33:33.735624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.585 [2024-11-29 05:33:33.735672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.585 [2024-11-29 05:33:33.735687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.585 #78 NEW cov: 11859 ft: 14781 corp: 30/1228b lim: 100 exec/s: 78 rss: 68Mb L: 46/97 MS: 1 InsertRepeatedBytes- 00:08:22.585 [2024-11-29 05:33:33.775636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.585 [2024-11-29 05:33:33.775662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.585 #79 NEW cov: 11859 ft: 14793 corp: 31/1260b lim: 100 exec/s: 79 rss: 68Mb L: 32/97 MS: 1 ChangeBit- 00:08:22.585 [2024-11-29 05:33:33.805682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.585 [2024-11-29 05:33:33.805708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.585 #80 NEW cov: 11859 ft: 14807 corp: 32/1292b lim: 100 exec/s: 80 rss: 68Mb L: 32/97 MS: 1 InsertByte- 00:08:22.585 [2024-11-29 05:33:33.835754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 
cid:0 nsid:0 00:08:22.585 [2024-11-29 05:33:33.835780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.585 #81 NEW cov: 11859 ft: 14812 corp: 33/1323b lim: 100 exec/s: 81 rss: 69Mb L: 31/97 MS: 1 CrossOver- 00:08:22.585 [2024-11-29 05:33:33.875884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.585 [2024-11-29 05:33:33.875912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.845 #82 NEW cov: 11859 ft: 14848 corp: 34/1355b lim: 100 exec/s: 82 rss: 69Mb L: 32/97 MS: 1 InsertByte- 00:08:22.845 [2024-11-29 05:33:33.915992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.845 [2024-11-29 05:33:33.916019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.845 #83 NEW cov: 11859 ft: 14858 corp: 35/1388b lim: 100 exec/s: 83 rss: 69Mb L: 33/97 MS: 1 ChangeBinInt- 00:08:22.845 [2024-11-29 05:33:33.956131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.845 [2024-11-29 05:33:33.956157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.845 #84 NEW cov: 11859 ft: 14881 corp: 36/1419b lim: 100 exec/s: 84 rss: 69Mb L: 31/97 MS: 1 ShuffleBytes- 00:08:22.845 [2024-11-29 05:33:33.986256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.845 [2024-11-29 05:33:33.986283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.845 #85 NEW cov: 11859 ft: 14886 corp: 37/1450b lim: 100 exec/s: 85 rss: 69Mb L: 31/97 MS: 1 InsertByte- 00:08:22.845 [2024-11-29 05:33:34.026556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.845 [2024-11-29 05:33:34.026582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.845 [2024-11-29 05:33:34.026653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.845 [2024-11-29 05:33:34.026668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.845 [2024-11-29 05:33:34.026721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.845 [2024-11-29 05:33:34.026739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.845 #86 NEW cov: 11859 ft: 14920 corp: 38/1524b lim: 100 exec/s: 86 rss: 69Mb L: 74/97 MS: 1 ChangeBit- 00:08:22.845 [2024-11-29 05:33:34.066795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.845 [2024-11-29 05:33:34.066822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.845 [2024-11-29 05:33:34.066888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.845 [2024-11-29 
05:33:34.066903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.845 [2024-11-29 05:33:34.066956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.845 [2024-11-29 05:33:34.066971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.845 [2024-11-29 05:33:34.067022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:22.845 [2024-11-29 05:33:34.067035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.845 #87 NEW cov: 11859 ft: 14956 corp: 39/1605b lim: 100 exec/s: 87 rss: 69Mb L: 81/97 MS: 1 CrossOver- 00:08:22.845 [2024-11-29 05:33:34.106578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.845 [2024-11-29 05:33:34.106608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.845 #88 NEW cov: 11859 ft: 14969 corp: 40/1636b lim: 100 exec/s: 88 rss: 69Mb L: 31/97 MS: 1 CMP- DE: "?\000\000\000\000\000\000\000"- 00:08:22.845 [2024-11-29 05:33:34.146737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.845 [2024-11-29 05:33:34.146764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.105 #89 NEW cov: 11859 ft: 14971 corp: 41/1669b lim: 100 exec/s: 89 rss: 69Mb L: 33/97 MS: 1 InsertByte- 00:08:23.105 [2024-11-29 05:33:34.186925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.105 [2024-11-29 05:33:34.186951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.105 [2024-11-29 05:33:34.186985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.105 [2024-11-29 05:33:34.186999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.105 #90 NEW cov: 11859 ft: 15006 corp: 42/1714b lim: 100 exec/s: 90 rss: 69Mb L: 45/97 MS: 1 CopyPart- 00:08:23.105 [2024-11-29 05:33:34.226950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.105 [2024-11-29 05:33:34.226977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.105 #91 NEW cov: 11859 ft: 15015 corp: 43/1743b lim: 100 exec/s: 91 rss: 69Mb L: 29/97 MS: 1 ChangeBit- 00:08:23.105 [2024-11-29 05:33:34.267002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.105 [2024-11-29 05:33:34.267029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.105 #92 NEW cov: 11859 ft: 15054 corp: 44/1775b lim: 100 exec/s: 92 rss: 69Mb L: 32/97 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:23.105 [2024-11-29 05:33:34.307473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 
00:08:23.105 [2024-11-29 05:33:34.307500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.105 [2024-11-29 05:33:34.307545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.105 [2024-11-29 05:33:34.307560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.105 [2024-11-29 05:33:34.307613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.105 [2024-11-29 05:33:34.307627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.105 [2024-11-29 05:33:34.307697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.105 [2024-11-29 05:33:34.307711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.105 #93 NEW cov: 11859 ft: 15095 corp: 45/1872b lim: 100 exec/s: 93 rss: 69Mb L: 97/97 MS: 1 CrossOver- 00:08:23.105 [2024-11-29 05:33:34.347584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.105 [2024-11-29 05:33:34.347616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.105 [2024-11-29 05:33:34.347670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.105 [2024-11-29 05:33:34.347683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.105 [2024-11-29 05:33:34.347735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.105 [2024-11-29 05:33:34.347749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.105 [2024-11-29 05:33:34.347802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.105 [2024-11-29 05:33:34.347812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.105 #94 NEW cov: 11859 ft: 15101 corp: 46/1969b lim: 100 exec/s: 94 rss: 69Mb L: 97/97 MS: 1 ShuffleBytes- 00:08:23.105 [2024-11-29 05:33:34.387527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.106 [2024-11-29 05:33:34.387553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.106 [2024-11-29 05:33:34.387591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.106 [2024-11-29 05:33:34.387610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.365 #95 NEW cov: 11859 ft: 15104 corp: 47/2017b lim: 100 exec/s: 47 rss: 70Mb L: 48/97 MS: 1 CrossOver- 00:08:23.365 #95 DONE cov: 11859 ft: 15104 corp: 47/2017b lim: 100 exec/s: 47 rss: 70Mb 00:08:23.365 ###### Recommended dictionary. 
###### 00:08:23.365 "\377\377\377\377\377\377\377\377" # Uses: 5 00:08:23.365 "\377\001\000\000" # Uses: 0 00:08:23.365 "?\000\000\000\000\000\000\000" # Uses: 0 00:08:23.365 ###### End of recommended dictionary. ###### 00:08:23.365 Done 95 runs in 2 second(s) 00:08:23.365 05:33:34 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:23.365 05:33:34 -- ../common.sh@72 -- # (( i++ )) 00:08:23.365 05:33:34 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.365 05:33:34 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:23.365 05:33:34 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:23.365 05:33:34 -- nvmf/run.sh@24 -- # local timen=1 00:08:23.365 05:33:34 -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.365 05:33:34 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:23.365 05:33:34 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:23.365 05:33:34 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:23.365 05:33:34 -- nvmf/run.sh@29 -- # port=4419 00:08:23.365 05:33:34 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:23.365 05:33:34 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:23.365 05:33:34 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.365 05:33:34 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:23.365 [2024-11-29 05:33:34.563908] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:23.365 [2024-11-29 05:33:34.563977] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2222012 ] 00:08:23.365 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.625 [2024-11-29 05:33:34.749292] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.625 [2024-11-29 05:33:34.768812] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:23.625 [2024-11-29 05:33:34.768951] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.625 [2024-11-29 05:33:34.820304] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.625 [2024-11-29 05:33:34.836684] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:23.625 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:23.625 INFO: Seed: 3870531824 00:08:23.625 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:23.625 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:23.625 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:23.625 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.625 #2 INITED exec/s: 0 rss: 59Mb 00:08:23.625 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:23.625 This may also happen if the target rejected all inputs we tried so far 00:08:23.625 [2024-11-29 05:33:34.881854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:23.625 [2024-11-29 05:33:34.881886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.625 [2024-11-29 05:33:34.881925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:23.625 [2024-11-29 05:33:34.881940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.625 [2024-11-29 05:33:34.881987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468691120727718 len:42663 00:08:23.625 [2024-11-29 05:33:34.882003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.884 NEW_FUNC[1/670]: 0x4719d8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:23.884 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.884 #22 NEW cov: 11602 ft: 11611 corp: 2/38b lim: 50 exec/s: 0 rss: 67Mb L: 37/37 MS: 5 CrossOver-ShuffleBytes-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:23.884 [2024-11-29 05:33:35.182748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:23.884 [2024-11-29 05:33:35.182782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.884 [2024-11-29 05:33:35.182815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:23.884 [2024-11-29 05:33:35.182834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.884 [2024-11-29 05:33:35.182883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1302123113578145298 len:4627 00:08:23.884 [2024-11-29 05:33:35.182897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.884 [2024-11-29 05:33:35.182949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12008468688627962534 len:42663 00:08:23.884 [2024-11-29 05:33:35.182964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 
cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.151 #23 NEW cov: 11723 ft: 12324 corp: 3/85b lim: 50 exec/s: 0 rss: 67Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:24.151 [2024-11-29 05:33:35.232662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691137504934 len:42663 00:08:24.151 [2024-11-29 05:33:35.232690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.151 [2024-11-29 05:33:35.232741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:24.151 [2024-11-29 05:33:35.232755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.151 [2024-11-29 05:33:35.232806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468691120727718 len:42663 00:08:24.151 [2024-11-29 05:33:35.232821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.151 #34 NEW cov: 11729 ft: 12588 corp: 4/122b lim: 50 exec/s: 0 rss: 67Mb L: 37/47 MS: 1 ChangeBit- 00:08:24.151 [2024-11-29 05:33:35.272834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:24.151 [2024-11-29 05:33:35.272865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.151 [2024-11-29 05:33:35.272902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:24.152 [2024-11-29 05:33:35.272918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.152 [2024-11-29 05:33:35.272967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:24.152 [2024-11-29 05:33:35.272983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.152 #35 NEW cov: 11814 ft: 12833 corp: 5/153b lim: 50 exec/s: 0 rss: 67Mb L: 31/47 MS: 1 InsertRepeatedBytes- 00:08:24.152 [2024-11-29 05:33:35.312825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:24.152 [2024-11-29 05:33:35.312853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.152 [2024-11-29 05:33:35.312906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:24.152 [2024-11-29 05:33:35.312922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.152 #36 NEW cov: 11814 ft: 13210 corp: 6/182b lim: 50 exec/s: 0 rss: 67Mb L: 29/47 MS: 1 EraseBytes- 00:08:24.152 [2024-11-29 05:33:35.353046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691070396070 len:42663 00:08:24.152 [2024-11-29 05:33:35.353074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:24.152 [2024-11-29 05:33:35.353118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:24.152 [2024-11-29 05:33:35.353133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.152 [2024-11-29 05:33:35.353183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468691120727718 len:42663 00:08:24.152 [2024-11-29 05:33:35.353198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.152 #37 NEW cov: 11814 ft: 13322 corp: 7/219b lim: 50 exec/s: 0 rss: 67Mb L: 37/47 MS: 1 ChangeBit- 00:08:24.152 [2024-11-29 05:33:35.393177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691137504934 len:42663 00:08:24.152 [2024-11-29 05:33:35.393204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.152 [2024-11-29 05:33:35.393239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:24.152 [2024-11-29 05:33:35.393254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.152 [2024-11-29 05:33:35.393304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468691120727788 len:42663 00:08:24.152 [2024-11-29 05:33:35.393319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.152 #38 NEW cov: 11814 ft: 13460 corp: 8/257b lim: 50 exec/s: 0 rss: 67Mb L: 38/47 MS: 1 InsertByte- 00:08:24.152 [2024-11-29 05:33:35.433244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:168427520 len:1 00:08:24.152 [2024-11-29 05:33:35.433271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.152 [2024-11-29 05:33:35.433306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:24.152 [2024-11-29 05:33:35.433320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.152 [2024-11-29 05:33:35.433373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:24.152 [2024-11-29 05:33:35.433389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.452 #43 NEW cov: 11814 ft: 13503 corp: 9/289b lim: 50 exec/s: 0 rss: 67Mb L: 32/47 MS: 5 InsertByte-CrossOver-CrossOver-CopyPart-CrossOver- 00:08:24.452 [2024-11-29 05:33:35.463343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:168427520 len:1 00:08:24.452 [2024-11-29 05:33:35.463371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.463408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:24.452 [2024-11-29 05:33:35.463422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.463475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:24.452 [2024-11-29 05:33:35.463491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.452 #44 NEW cov: 11814 ft: 13617 corp: 10/321b lim: 50 exec/s: 0 rss: 67Mb L: 32/47 MS: 1 CopyPart- 00:08:24.452 [2024-11-29 05:33:35.503609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:24.452 [2024-11-29 05:33:35.503639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.503699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:24.452 [2024-11-29 05:33:35.503713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.503761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008305327744657062 len:4627 00:08:24.452 [2024-11-29 05:33:35.503775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.503823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1302286474461450770 len:42663 00:08:24.452 [2024-11-29 05:33:35.503837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.452 #45 NEW cov: 11814 ft: 13655 corp: 11/368b lim: 50 exec/s: 0 rss: 67Mb L: 47/47 MS: 1 CopyPart- 00:08:24.452 [2024-11-29 05:33:35.543704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:24.452 [2024-11-29 05:33:35.543730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.543765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:24.452 [2024-11-29 05:33:35.543780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.543831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468021105829542 len:2571 00:08:24.452 [2024-11-29 05:33:35.543846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.452 #46 NEW cov: 11814 ft: 13709 corp: 12/398b lim: 50 exec/s: 0 rss: 67Mb L: 30/47 MS: 1 EraseBytes- 00:08:24.452 [2024-11-29 05:33:35.583672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691070396070 len:42664 00:08:24.452 [2024-11-29 
05:33:35.583699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.583740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:24.452 [2024-11-29 05:33:35.583755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.583804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468691120727718 len:42663 00:08:24.452 [2024-11-29 05:33:35.583819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.452 #47 NEW cov: 11814 ft: 13793 corp: 13/435b lim: 50 exec/s: 0 rss: 67Mb L: 37/47 MS: 1 ChangeBinInt- 00:08:24.452 [2024-11-29 05:33:35.623778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18421692799305908223 len:65536 00:08:24.452 [2024-11-29 05:33:35.623804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.452 [2024-11-29 05:33:35.623852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:24.452 [2024-11-29 05:33:35.623867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.452 #52 NEW cov: 11814 ft: 13813 corp: 14/463b lim: 50 exec/s: 0 rss: 68Mb L: 28/47 MS: 5 CrossOver-ShuffleBytes-ShuffleBytes-CMP-InsertRepeatedBytes- DE: "\377\377\377\377"- 00:08:24.453 [2024-11-29 05:33:35.663881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18421692799305908223 len:65536 00:08:24.453 [2024-11-29 05:33:35.663908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.453 [2024-11-29 05:33:35.663944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:24.453 [2024-11-29 05:33:35.663960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.453 #53 NEW cov: 11814 ft: 13830 corp: 15/491b lim: 50 exec/s: 0 rss: 68Mb L: 28/47 MS: 1 ShuffleBytes- 00:08:24.453 [2024-11-29 05:33:35.704074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:24.453 [2024-11-29 05:33:35.704101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.453 [2024-11-29 05:33:35.704136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1302123749233267218 len:42663 00:08:24.453 [2024-11-29 05:33:35.704152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.453 [2024-11-29 05:33:35.704203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468021105829542 len:2571 00:08:24.453 
[2024-11-29 05:33:35.704218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.453 #54 NEW cov: 11814 ft: 13840 corp: 16/522b lim: 50 exec/s: 0 rss: 68Mb L: 31/47 MS: 1 EraseBytes- 00:08:24.744 [2024-11-29 05:33:35.744309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:168427520 len:1 00:08:24.744 [2024-11-29 05:33:35.744336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.744 [2024-11-29 05:33:35.744399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:24.744 [2024-11-29 05:33:35.744414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.744 [2024-11-29 05:33:35.744464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468052669628416 len:4627 00:08:24.744 [2024-11-29 05:33:35.744479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.744 [2024-11-29 05:33:35.744531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1302103242566668818 len:1 00:08:24.744 [2024-11-29 05:33:35.744547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.744 #55 NEW cov: 11814 ft: 13905 corp: 17/566b lim: 50 exec/s: 0 rss: 68Mb L: 44/47 MS: 1 CrossOver- 00:08:24.744 [2024-11-29 05:33:35.784178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:469762048 len:65536 00:08:24.744 [2024-11-29 05:33:35.784205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.744 [2024-11-29 05:33:35.784259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:24.744 [2024-11-29 05:33:35.784276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.744 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:24.744 #56 NEW cov: 11837 ft: 14001 corp: 18/594b lim: 50 exec/s: 0 rss: 68Mb L: 28/47 MS: 1 ChangeBinInt- 00:08:24.744 [2024-11-29 05:33:35.824525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:24.744 [2024-11-29 05:33:35.824553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.744 [2024-11-29 05:33:35.824604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:24.744 [2024-11-29 05:33:35.824620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.744 [2024-11-29 05:33:35.824668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 
00:08:24.744 [2024-11-29 05:33:35.824683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.744 [2024-11-29 05:33:35.824733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:24.744 [2024-11-29 05:33:35.824747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.744 #59 NEW cov: 11837 ft: 14014 corp: 19/642b lim: 50 exec/s: 0 rss: 68Mb L: 48/48 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:24.744 [2024-11-29 05:33:35.864683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:24.744 [2024-11-29 05:33:35.864710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.744 [2024-11-29 05:33:35.864756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1027659620853414 len:42663 00:08:24.745 [2024-11-29 05:33:35.864770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.745 [2024-11-29 05:33:35.864819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468691120727718 len:4627 00:08:24.745 [2024-11-29 05:33:35.864834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.745 [2024-11-29 05:33:35.864882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1302123111085380114 len:42663 00:08:24.745 [2024-11-29 05:33:35.864898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.745 #60 NEW cov: 11837 ft: 14063 corp: 20/691b lim: 50 exec/s: 60 rss: 68Mb L: 49/49 MS: 1 CMP- DE: "\000\003"- 00:08:24.745 [2024-11-29 05:33:35.904791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:357862451 len:2571 00:08:24.745 [2024-11-29 05:33:35.904817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.745 [2024-11-29 05:33:35.904853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:24.745 [2024-11-29 05:33:35.904868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.745 [2024-11-29 05:33:35.904922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:24.745 [2024-11-29 05:33:35.904937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.745 [2024-11-29 05:33:35.904990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:24.745 [2024-11-29 05:33:35.905004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.745 #61 NEW cov: 11837 ft: 14108 
corp: 21/731b lim: 50 exec/s: 61 rss: 68Mb L: 40/49 MS: 1 CMP- DE: "\025T\2143\000\000\000\000"- 00:08:24.745 [2024-11-29 05:33:35.944662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:24.745 [2024-11-29 05:33:35.944689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.745 [2024-11-29 05:33:35.944733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468021105829542 len:2571 00:08:24.745 [2024-11-29 05:33:35.944747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.745 #62 NEW cov: 11837 ft: 14149 corp: 22/751b lim: 50 exec/s: 62 rss: 68Mb L: 20/49 MS: 1 EraseBytes- 00:08:24.745 [2024-11-29 05:33:35.984805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:4627 00:08:24.745 [2024-11-29 05:33:35.984831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.745 [2024-11-29 05:33:35.984864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468688627962386 len:42663 00:08:24.745 [2024-11-29 05:33:35.984879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.745 #63 NEW cov: 11837 ft: 14168 corp: 23/776b lim: 50 exec/s: 63 rss: 68Mb L: 25/49 MS: 1 EraseBytes- 00:08:24.745 [2024-11-29 05:33:36.024919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446646213058494463 len:65536 00:08:24.745 [2024-11-29 05:33:36.024947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.745 [2024-11-29 05:33:36.024994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:24.745 [2024-11-29 05:33:36.025009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.745 #64 NEW cov: 11837 ft: 14173 corp: 24/805b lim: 50 exec/s: 64 rss: 68Mb L: 29/49 MS: 1 CrossOver- 00:08:25.005 [2024-11-29 05:33:36.065116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:25.005 [2024-11-29 05:33:36.065142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.065193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:25.005 [2024-11-29 05:33:36.065209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.065260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468021105829542 len:2571 00:08:25.005 [2024-11-29 05:33:36.065275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.005 #65 NEW cov: 11837 ft: 14194 corp: 25/835b lim: 50 exec/s: 65 rss: 68Mb L: 30/49 MS: 1 ShuffleBytes- 00:08:25.005 [2024-11-29 05:33:36.105343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:168427520 len:1 00:08:25.005 [2024-11-29 05:33:36.105371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.105410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:25.005 [2024-11-29 05:33:36.105426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.105481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:25.005 [2024-11-29 05:33:36.105496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.105547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:25.005 [2024-11-29 05:33:36.105562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.005 #66 NEW cov: 11837 ft: 14199 corp: 26/881b lim: 50 exec/s: 66 rss: 68Mb L: 46/49 MS: 1 CopyPart- 00:08:25.005 [2024-11-29 05:33:36.145372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:182192640 len:1 00:08:25.005 [2024-11-29 05:33:36.145398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.145436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:25.005 [2024-11-29 05:33:36.145450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.145502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:25.005 [2024-11-29 05:33:36.145518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.005 #67 NEW cov: 11837 ft: 14242 corp: 27/914b lim: 50 exec/s: 67 rss: 68Mb L: 33/49 MS: 1 InsertByte- 00:08:25.005 [2024-11-29 05:33:36.175320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:4627 00:08:25.005 [2024-11-29 05:33:36.175347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.175388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008477484720984594 len:42663 00:08:25.005 [2024-11-29 05:33:36.175402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.005 #68 NEW cov: 11837 ft: 14255 corp: 28/939b lim: 50 exec/s: 68 rss: 68Mb L: 25/49 MS: 1 ChangeBit- 00:08:25.005 [2024-11-29 05:33:36.215446] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:4627 00:08:25.005 [2024-11-29 05:33:36.215474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.215509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008476861950726674 len:21645 00:08:25.005 [2024-11-29 05:33:36.215525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.005 #69 NEW cov: 11837 ft: 14312 corp: 29/964b lim: 50 exec/s: 69 rss: 68Mb L: 25/49 MS: 1 PersAutoDict- DE: "\025T\2143\000\000\000\000"- 00:08:25.005 [2024-11-29 05:33:36.255671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:168427520 len:1 00:08:25.005 [2024-11-29 05:33:36.255700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.255751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:67108864 len:1 00:08:25.005 [2024-11-29 05:33:36.255767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.255819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:25.005 [2024-11-29 05:33:36.255837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.005 #70 NEW cov: 11837 ft: 14318 corp: 30/996b lim: 50 exec/s: 70 rss: 69Mb L: 32/49 MS: 1 ChangeBit- 00:08:25.005 [2024-11-29 05:33:36.295869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:25.005 [2024-11-29 05:33:36.295897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.295939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446743502478901247 len:65536 00:08:25.005 [2024-11-29 05:33:36.295954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.296001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:25.005 [2024-11-29 05:33:36.296016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.005 [2024-11-29 05:33:36.296066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:25.005 [2024-11-29 05:33:36.296080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.265 #71 NEW cov: 11837 ft: 14338 corp: 31/1044b lim: 50 exec/s: 71 rss: 69Mb L: 48/49 MS: 1 ChangeByte- 00:08:25.265 [2024-11-29 05:33:36.335897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 
nsid:0 lba:12008468691120727718 len:42663 00:08:25.265 [2024-11-29 05:33:36.335925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.265 [2024-11-29 05:33:36.335960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1302123749233267218 len:42663 00:08:25.265 [2024-11-29 05:33:36.335974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.265 [2024-11-29 05:33:36.336024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8908288953810944 len:2571 00:08:25.265 [2024-11-29 05:33:36.336039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.265 #72 NEW cov: 11837 ft: 14353 corp: 32/1075b lim: 50 exec/s: 72 rss: 69Mb L: 31/49 MS: 1 ChangeBinInt- 00:08:25.265 [2024-11-29 05:33:36.375817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11964375360241641126 len:1 00:08:25.265 [2024-11-29 05:33:36.375844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.265 #73 NEW cov: 11837 ft: 14653 corp: 33/1088b lim: 50 exec/s: 73 rss: 69Mb L: 13/49 MS: 1 CrossOver- 00:08:25.265 [2024-11-29 05:33:36.416120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727610 len:42663 00:08:25.265 [2024-11-29 05:33:36.416148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.265 [2024-11-29 05:33:36.416187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:25.265 [2024-11-29 05:33:36.416203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.265 [2024-11-29 05:33:36.416254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468021105829542 len:2571 00:08:25.265 [2024-11-29 05:33:36.416270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.265 #74 NEW cov: 11837 ft: 14663 corp: 34/1118b lim: 50 exec/s: 74 rss: 69Mb L: 30/49 MS: 1 ChangeByte- 00:08:25.265 [2024-11-29 05:33:36.456143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:357862451 len:2571 00:08:25.265 [2024-11-29 05:33:36.456170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.265 [2024-11-29 05:33:36.456204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:25.265 [2024-11-29 05:33:36.456219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.265 #75 NEW cov: 11837 ft: 14691 corp: 35/1144b lim: 50 exec/s: 75 rss: 69Mb L: 26/49 MS: 1 EraseBytes- 00:08:25.265 [2024-11-29 05:33:36.496261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE 
sqid:1 cid:0 nsid:0 lba:357862400 len:1 00:08:25.265 [2024-11-29 05:33:36.496290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.265 [2024-11-29 05:33:36.496330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:25.265 [2024-11-29 05:33:36.496346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.265 #76 NEW cov: 11837 ft: 14716 corp: 36/1170b lim: 50 exec/s: 76 rss: 69Mb L: 26/49 MS: 1 CopyPart- 00:08:25.265 [2024-11-29 05:33:36.536616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691137504934 len:42663 00:08:25.265 [2024-11-29 05:33:36.536644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.265 [2024-11-29 05:33:36.536683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:25.265 [2024-11-29 05:33:36.536699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.265 [2024-11-29 05:33:36.536750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468691120727718 len:42663 00:08:25.265 [2024-11-29 05:33:36.536764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.265 [2024-11-29 05:33:36.536816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:723672163624068618 len:65536 00:08:25.265 [2024-11-29 05:33:36.536831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.265 #77 NEW cov: 11837 ft: 14723 corp: 37/1219b lim: 50 exec/s: 77 rss: 69Mb L: 49/49 MS: 1 CrossOver- 00:08:25.525 [2024-11-29 05:33:36.576480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:4627 00:08:25.525 [2024-11-29 05:33:36.576508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.525 [2024-11-29 05:33:36.576559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008477484720984594 len:42663 00:08:25.525 [2024-11-29 05:33:36.576574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.525 #78 NEW cov: 11837 ft: 14754 corp: 38/1244b lim: 50 exec/s: 78 rss: 69Mb L: 25/49 MS: 1 ChangeBit- 00:08:25.525 [2024-11-29 05:33:36.616531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:4627 00:08:25.525 [2024-11-29 05:33:36.616558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.525 [2024-11-29 05:33:36.616602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008476861950726674 len:21645 00:08:25.525 [2024-11-29 05:33:36.616621] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.526 #79 NEW cov: 11837 ft: 14848 corp: 39/1269b lim: 50 exec/s: 79 rss: 69Mb L: 25/49 MS: 1 ShuffleBytes- 00:08:25.526 [2024-11-29 05:33:36.656874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:25.526 [2024-11-29 05:33:36.656902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.656939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:42663 00:08:25.526 [2024-11-29 05:33:36.656954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.657008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:11997046328676230822 len:4627 00:08:25.526 [2024-11-29 05:33:36.657023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.657077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1302286474461450770 len:42663 00:08:25.526 [2024-11-29 05:33:36.657092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.526 #80 NEW cov: 11837 ft: 14863 corp: 40/1316b lim: 50 exec/s: 80 rss: 69Mb L: 47/49 MS: 1 ChangeByte- 00:08:25.526 [2024-11-29 05:33:36.696780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691120727718 len:42663 00:08:25.526 [2024-11-29 05:33:36.696808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.696860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468021105829542 len:3329 00:08:25.526 [2024-11-29 05:33:36.696876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.526 #81 NEW cov: 11837 ft: 14904 corp: 41/1336b lim: 50 exec/s: 81 rss: 69Mb L: 20/49 MS: 1 CMP- DE: "\015\000"- 00:08:25.526 [2024-11-29 05:33:36.737038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691137504934 len:42663 00:08:25.526 [2024-11-29 05:33:36.737066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.737101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468691120727718 len:60583 00:08:25.526 [2024-11-29 05:33:36.737117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.737170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468691120727718 len:42507 00:08:25.526 [2024-11-29 05:33:36.737186] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.526 #82 NEW cov: 11837 ft: 14924 corp: 42/1369b lim: 50 exec/s: 82 rss: 69Mb L: 33/49 MS: 1 EraseBytes- 00:08:25.526 [2024-11-29 05:33:36.777230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:25.526 [2024-11-29 05:33:36.777258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.777296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15855485438111252490 len:1 00:08:25.526 [2024-11-29 05:33:36.777315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.777367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:25.526 [2024-11-29 05:33:36.777382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.777433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:25.526 [2024-11-29 05:33:36.777448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.526 #83 NEW cov: 11837 ft: 14932 corp: 43/1415b lim: 50 exec/s: 83 rss: 69Mb L: 46/49 MS: 1 InsertRepeatedBytes- 00:08:25.526 [2024-11-29 05:33:36.817360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:168427520 len:1 00:08:25.526 [2024-11-29 05:33:36.817386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.817422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:25.526 [2024-11-29 05:33:36.817437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.817489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:25.526 [2024-11-29 05:33:36.817504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.526 [2024-11-29 05:33:36.817575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:25.526 [2024-11-29 05:33:36.817591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.786 #84 NEW cov: 11837 ft: 14947 corp: 44/1461b lim: 50 exec/s: 84 rss: 70Mb L: 46/49 MS: 1 ShuffleBytes- 00:08:25.786 [2024-11-29 05:33:36.857398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691070396070 len:42664 00:08:25.786 [2024-11-29 05:33:36.857425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.786 [2024-11-29 05:33:36.857460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468265918965414 len:42663 00:08:25.786 [2024-11-29 05:33:36.857476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.786 [2024-11-29 05:33:36.857529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12008468691120727718 len:42663 00:08:25.786 [2024-11-29 05:33:36.857543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.786 [2024-11-29 05:33:36.897594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12008468691070396070 len:42664 00:08:25.786 [2024-11-29 05:33:36.897625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.786 [2024-11-29 05:33:36.897674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12008468265918965414 len:42498 00:08:25.786 [2024-11-29 05:33:36.897690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.786 [2024-11-29 05:33:36.897741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:712964571136 len:42663 00:08:25.786 [2024-11-29 05:33:36.897757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.786 [2024-11-29 05:33:36.897813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12008468691120727718 len:42663 00:08:25.786 [2024-11-29 05:33:36.897828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.786 #86 NEW cov: 11837 ft: 14966 corp: 45/1507b lim: 50 exec/s: 43 rss: 70Mb L: 46/49 MS: 2 InsertByte-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:25.786 #86 DONE cov: 11837 ft: 14966 corp: 45/1507b lim: 50 exec/s: 43 rss: 70Mb 00:08:25.786 ###### Recommended dictionary. ###### 00:08:25.786 "\377\377\377\377" # Uses: 0 00:08:25.786 "\000\003" # Uses: 0 00:08:25.786 "\025T\2143\000\000\000\000" # Uses: 1 00:08:25.786 "\015\000" # Uses: 0 00:08:25.786 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:25.786 ###### End of recommended dictionary. 
###### 00:08:25.786 Done 86 runs in 2 second(s) 00:08:25.786 05:33:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:25.786 05:33:37 -- ../common.sh@72 -- # (( i++ )) 00:08:25.786 05:33:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.786 05:33:37 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:25.786 05:33:37 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:25.786 05:33:37 -- nvmf/run.sh@24 -- # local timen=1 00:08:25.786 05:33:37 -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.786 05:33:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:25.786 05:33:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:25.786 05:33:37 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:25.786 05:33:37 -- nvmf/run.sh@29 -- # port=4420 00:08:25.786 05:33:37 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:25.786 05:33:37 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:25.786 05:33:37 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.786 05:33:37 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:25.786 [2024-11-29 05:33:37.073646] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:25.786 [2024-11-29 05:33:37.073720] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2222554 ] 00:08:26.046 EAL: No free 2048 kB hugepages reported on node 1 00:08:26.046 [2024-11-29 05:33:37.248098] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.046 [2024-11-29 05:33:37.267457] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:26.046 [2024-11-29 05:33:37.267577] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.046 [2024-11-29 05:33:37.319180] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.046 [2024-11-29 05:33:37.335532] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:26.306 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.306 INFO: Seed: 2074575338 00:08:26.306 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:26.306 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:26.306 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:26.306 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.306 #2 INITED exec/s: 0 rss: 60Mb 00:08:26.306 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:26.306 This may also happen if the target rejected all inputs we tried so far 00:08:26.306 [2024-11-29 05:33:37.380828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.306 [2024-11-29 05:33:37.380861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.306 [2024-11-29 05:33:37.380919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.306 [2024-11-29 05:33:37.380934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.306 [2024-11-29 05:33:37.380988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.306 [2024-11-29 05:33:37.381003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.567 NEW_FUNC[1/671]: 0x473598 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:26.567 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.567 #7 NEW cov: 11662 ft: 11669 corp: 2/69b lim: 90 exec/s: 0 rss: 67Mb L: 68/68 MS: 5 ChangeBit-CrossOver-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:26.567 [2024-11-29 05:33:37.681613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.567 [2024-11-29 05:33:37.681647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.681698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.567 [2024-11-29 05:33:37.681714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.681767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.567 [2024-11-29 05:33:37.681783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.567 NEW_FUNC[1/1]: 0x1c78a18 in spdk_thread_get_last_tsc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1312 00:08:26.567 #11 NEW cov: 11781 ft: 12065 corp: 3/134b lim: 90 exec/s: 0 rss: 67Mb L: 65/68 MS: 4 ShuffleBytes-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:26.567 [2024-11-29 05:33:37.721614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.567 [2024-11-29 05:33:37.721645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.721697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.567 [2024-11-29 05:33:37.721713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.721766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.567 [2024-11-29 05:33:37.721782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.567 #12 NEW cov: 11787 ft: 12387 corp: 4/202b lim: 90 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeBinInt- 00:08:26.567 [2024-11-29 05:33:37.761768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.567 [2024-11-29 05:33:37.761797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.761833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.567 [2024-11-29 05:33:37.761849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.761902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.567 [2024-11-29 05:33:37.761918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.567 #13 NEW cov: 11872 ft: 12719 corp: 5/267b lim: 90 exec/s: 0 rss: 67Mb L: 65/68 MS: 1 CrossOver- 00:08:26.567 [2024-11-29 05:33:37.801856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.567 [2024-11-29 05:33:37.801883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.801920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.567 [2024-11-29 05:33:37.801935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.801986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.567 [2024-11-29 05:33:37.802002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.567 #14 NEW cov: 11872 ft: 12805 corp: 6/335b lim: 90 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeByte- 00:08:26.567 [2024-11-29 05:33:37.841975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.567 [2024-11-29 05:33:37.842003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.842038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.567 [2024-11-29 05:33:37.842053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.567 [2024-11-29 05:33:37.842106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.567 [2024-11-29 05:33:37.842122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.827 #15 NEW cov: 11872 ft: 12847 corp: 7/400b lim: 90 exec/s: 0 rss: 68Mb L: 65/68 MS: 1 CMP- DE: "\003\000"- 
00:08:26.827 [2024-11-29 05:33:37.882211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.827 [2024-11-29 05:33:37.882239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:37.882297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.827 [2024-11-29 05:33:37.882313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:37.882365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.827 [2024-11-29 05:33:37.882380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:37.882432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:26.827 [2024-11-29 05:33:37.882448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.827 #16 NEW cov: 11872 ft: 13260 corp: 8/474b lim: 90 exec/s: 0 rss: 68Mb L: 74/74 MS: 1 InsertRepeatedBytes- 00:08:26.827 [2024-11-29 05:33:37.922146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.827 [2024-11-29 05:33:37.922172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:37.922210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.827 [2024-11-29 05:33:37.922225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:37.922281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.827 [2024-11-29 05:33:37.922296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.827 #17 NEW cov: 11872 ft: 13333 corp: 9/542b lim: 90 exec/s: 0 rss: 68Mb L: 68/74 MS: 1 ShuffleBytes- 00:08:26.827 [2024-11-29 05:33:37.962279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.827 [2024-11-29 05:33:37.962306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:37.962369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.827 [2024-11-29 05:33:37.962384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:37.962437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.827 [2024-11-29 05:33:37.962469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.827 #18 NEW cov: 11872 ft: 13387 corp: 10/610b lim: 90 exec/s: 0 rss: 68Mb L: 68/74 MS: 1 ChangeByte- 
00:08:26.827 [2024-11-29 05:33:38.002098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.827 [2024-11-29 05:33:38.002126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.827 #19 NEW cov: 11872 ft: 14308 corp: 11/642b lim: 90 exec/s: 0 rss: 68Mb L: 32/74 MS: 1 CrossOver- 00:08:26.827 [2024-11-29 05:33:38.042684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.827 [2024-11-29 05:33:38.042712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:38.042752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.827 [2024-11-29 05:33:38.042768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:38.042818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.827 [2024-11-29 05:33:38.042831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:38.042882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:26.827 [2024-11-29 05:33:38.042896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.827 #20 NEW cov: 11872 ft: 14342 corp: 12/725b lim: 90 exec/s: 0 rss: 68Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:08:26.827 [2024-11-29 05:33:38.082806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.827 [2024-11-29 05:33:38.082834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:38.082895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.827 [2024-11-29 05:33:38.082911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:38.082966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:26.827 [2024-11-29 05:33:38.082982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.827 [2024-11-29 05:33:38.083033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:26.827 [2024-11-29 05:33:38.083051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.827 #21 NEW cov: 11872 ft: 14366 corp: 13/802b lim: 90 exec/s: 0 rss: 68Mb L: 77/83 MS: 1 InsertRepeatedBytes- 00:08:26.827 [2024-11-29 05:33:38.122612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:26.827 [2024-11-29 05:33:38.122639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:26.827 [2024-11-29 05:33:38.122677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:26.827 [2024-11-29 05:33:38.122692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.087 #22 NEW cov: 11872 ft: 14736 corp: 14/839b lim: 90 exec/s: 0 rss: 68Mb L: 37/83 MS: 1 InsertRepeatedBytes- 00:08:27.087 [2024-11-29 05:33:38.162904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.087 [2024-11-29 05:33:38.162931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.162970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.087 [2024-11-29 05:33:38.162985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.163036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.087 [2024-11-29 05:33:38.163051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.087 #23 NEW cov: 11872 ft: 14823 corp: 15/907b lim: 90 exec/s: 0 rss: 68Mb L: 68/83 MS: 1 ChangeBit- 00:08:27.087 [2024-11-29 05:33:38.203128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.087 [2024-11-29 05:33:38.203155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.203203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.087 [2024-11-29 05:33:38.203218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.203269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.087 [2024-11-29 05:33:38.203283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.203334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.087 [2024-11-29 05:33:38.203349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.087 #24 NEW cov: 11872 ft: 14849 corp: 16/995b lim: 90 exec/s: 0 rss: 68Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:08:27.087 [2024-11-29 05:33:38.243275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.087 [2024-11-29 05:33:38.243303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.243366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.087 [2024-11-29 05:33:38.243382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.243432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.087 [2024-11-29 05:33:38.243448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.243504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.087 [2024-11-29 05:33:38.243519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.087 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:27.087 #30 NEW cov: 11895 ft: 14867 corp: 17/1078b lim: 90 exec/s: 0 rss: 68Mb L: 83/88 MS: 1 ChangeByte- 00:08:27.087 [2024-11-29 05:33:38.293400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.087 [2024-11-29 05:33:38.293427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.293475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.087 [2024-11-29 05:33:38.293491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.293541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.087 [2024-11-29 05:33:38.293555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.087 [2024-11-29 05:33:38.293611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.087 [2024-11-29 05:33:38.293626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.087 #31 NEW cov: 11895 ft: 14918 corp: 18/1161b lim: 90 exec/s: 0 rss: 68Mb L: 83/88 MS: 1 ChangeBinInt- 00:08:27.087 [2024-11-29 05:33:38.333353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.087 [2024-11-29 05:33:38.333380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.088 [2024-11-29 05:33:38.333417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.088 [2024-11-29 05:33:38.333432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.088 [2024-11-29 05:33:38.333487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.088 [2024-11-29 05:33:38.333502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.088 #37 NEW cov: 11895 ft: 14936 corp: 19/1226b lim: 90 exec/s: 0 rss: 68Mb L: 65/88 MS: 1 ChangeBit- 00:08:27.088 [2024-11-29 05:33:38.373200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.088 [2024-11-29 05:33:38.373229] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.348 #38 NEW cov: 11895 ft: 14973 corp: 20/1259b lim: 90 exec/s: 38 rss: 68Mb L: 33/88 MS: 1 InsertByte- 00:08:27.348 [2024-11-29 05:33:38.413594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.348 [2024-11-29 05:33:38.413624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.413672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.348 [2024-11-29 05:33:38.413687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.413740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.348 [2024-11-29 05:33:38.413755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.348 #39 NEW cov: 11895 ft: 14983 corp: 21/1324b lim: 90 exec/s: 39 rss: 68Mb L: 65/88 MS: 1 ShuffleBytes- 00:08:27.348 [2024-11-29 05:33:38.453878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.348 [2024-11-29 05:33:38.453906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.453953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.348 [2024-11-29 05:33:38.453968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.454018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.348 [2024-11-29 05:33:38.454034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.454085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.348 [2024-11-29 05:33:38.454100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.348 #40 NEW cov: 11895 ft: 15060 corp: 22/1411b lim: 90 exec/s: 40 rss: 68Mb L: 87/88 MS: 1 CMP- DE: "\377\377\377\000"- 00:08:27.348 [2024-11-29 05:33:38.493860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.348 [2024-11-29 05:33:38.493888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.493928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.348 [2024-11-29 05:33:38.493943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.493996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.348 [2024-11-29 
05:33:38.494026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.348 #41 NEW cov: 11895 ft: 15065 corp: 23/1481b lim: 90 exec/s: 41 rss: 68Mb L: 70/88 MS: 1 PersAutoDict- DE: "\003\000"- 00:08:27.348 [2024-11-29 05:33:38.533980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.348 [2024-11-29 05:33:38.534007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.534045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.348 [2024-11-29 05:33:38.534060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.534112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.348 [2024-11-29 05:33:38.534127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.348 #42 NEW cov: 11895 ft: 15084 corp: 24/1546b lim: 90 exec/s: 42 rss: 69Mb L: 65/88 MS: 1 PersAutoDict- DE: "\003\000"- 00:08:27.348 [2024-11-29 05:33:38.573897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.348 [2024-11-29 05:33:38.573924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.573977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.348 [2024-11-29 05:33:38.573993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.348 #43 NEW cov: 11895 ft: 15148 corp: 25/1597b lim: 90 exec/s: 43 rss: 69Mb L: 51/88 MS: 1 EraseBytes- 00:08:27.348 [2024-11-29 05:33:38.614131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.348 [2024-11-29 05:33:38.614157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.614211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.348 [2024-11-29 05:33:38.614226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.348 [2024-11-29 05:33:38.614281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.348 [2024-11-29 05:33:38.614297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.348 #44 NEW cov: 11895 ft: 15169 corp: 26/1662b lim: 90 exec/s: 44 rss: 69Mb L: 65/88 MS: 1 ShuffleBytes- 00:08:27.608 [2024-11-29 05:33:38.654277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.608 [2024-11-29 05:33:38.654304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.608 
[2024-11-29 05:33:38.654342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.608 [2024-11-29 05:33:38.654357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.654414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.608 [2024-11-29 05:33:38.654430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.608 #45 NEW cov: 11895 ft: 15179 corp: 27/1727b lim: 90 exec/s: 45 rss: 69Mb L: 65/88 MS: 1 ChangeBinInt- 00:08:27.608 [2024-11-29 05:33:38.694235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.608 [2024-11-29 05:33:38.694262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.694311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.608 [2024-11-29 05:33:38.694327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.608 #49 NEW cov: 11895 ft: 15199 corp: 28/1767b lim: 90 exec/s: 49 rss: 69Mb L: 40/88 MS: 4 ChangeBit-ChangeBit-ShuffleBytes-CrossOver- 00:08:27.608 [2024-11-29 05:33:38.734658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.608 [2024-11-29 05:33:38.734686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.734745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.608 [2024-11-29 05:33:38.734761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.734814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.608 [2024-11-29 05:33:38.734831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.734883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.608 [2024-11-29 05:33:38.734899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.608 #50 NEW cov: 11895 ft: 15226 corp: 29/1840b lim: 90 exec/s: 50 rss: 69Mb L: 73/88 MS: 1 CopyPart- 00:08:27.608 [2024-11-29 05:33:38.774789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.608 [2024-11-29 05:33:38.774820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.774856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.608 [2024-11-29 05:33:38.774872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.774926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.608 [2024-11-29 05:33:38.774942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.775014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.608 [2024-11-29 05:33:38.775028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.608 #51 NEW cov: 11895 ft: 15237 corp: 30/1918b lim: 90 exec/s: 51 rss: 69Mb L: 78/88 MS: 1 InsertRepeatedBytes- 00:08:27.608 [2024-11-29 05:33:38.814754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.608 [2024-11-29 05:33:38.814779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.814832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.608 [2024-11-29 05:33:38.814848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.814899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.608 [2024-11-29 05:33:38.814913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.608 #52 NEW cov: 11895 ft: 15265 corp: 31/1987b lim: 90 exec/s: 52 rss: 69Mb L: 69/88 MS: 1 InsertByte- 00:08:27.608 [2024-11-29 05:33:38.854874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.608 [2024-11-29 05:33:38.854901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.608 [2024-11-29 05:33:38.854940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.609 [2024-11-29 05:33:38.854955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.609 [2024-11-29 05:33:38.855010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.609 [2024-11-29 05:33:38.855026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.609 #53 NEW cov: 11895 ft: 15289 corp: 32/2056b lim: 90 exec/s: 53 rss: 69Mb L: 69/88 MS: 1 PersAutoDict- DE: "\377\377\377\000"- 00:08:27.609 [2024-11-29 05:33:38.894824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.609 [2024-11-29 05:33:38.894851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.609 [2024-11-29 05:33:38.894896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.609 [2024-11-29 05:33:38.894911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.868 #54 NEW cov: 11895 ft: 15311 corp: 33/2096b lim: 90 exec/s: 54 rss: 69Mb L: 40/88 MS: 1 EraseBytes- 00:08:27.868 [2024-11-29 05:33:38.935250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.868 [2024-11-29 05:33:38.935275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.868 [2024-11-29 05:33:38.935317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.868 [2024-11-29 05:33:38.935332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.868 [2024-11-29 05:33:38.935385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.868 [2024-11-29 05:33:38.935401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.868 [2024-11-29 05:33:38.935453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.868 [2024-11-29 05:33:38.935468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.868 #55 NEW cov: 11895 ft: 15314 corp: 34/2179b lim: 90 exec/s: 55 rss: 69Mb L: 83/88 MS: 1 InsertRepeatedBytes- 00:08:27.868 [2024-11-29 05:33:38.975303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.868 [2024-11-29 05:33:38.975329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.868 [2024-11-29 05:33:38.975379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.868 [2024-11-29 05:33:38.975395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.868 [2024-11-29 05:33:38.975446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.868 [2024-11-29 05:33:38.975461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.868 [2024-11-29 05:33:38.975512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.868 [2024-11-29 05:33:38.975527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.868 #56 NEW cov: 11895 ft: 15319 corp: 35/2252b lim: 90 exec/s: 56 rss: 69Mb L: 73/88 MS: 1 CopyPart- 00:08:27.869 [2024-11-29 05:33:39.015300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.869 [2024-11-29 05:33:39.015327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.015364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.869 [2024-11-29 05:33:39.015379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.015429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.869 [2024-11-29 05:33:39.015444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.055415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.869 [2024-11-29 05:33:39.055442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.055480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.869 [2024-11-29 05:33:39.055496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.055549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.869 [2024-11-29 05:33:39.055564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.869 #58 NEW cov: 11895 ft: 15356 corp: 36/2309b lim: 90 exec/s: 58 rss: 69Mb L: 57/88 MS: 2 EraseBytes-InsertByte- 00:08:27.869 [2024-11-29 05:33:39.095677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.869 [2024-11-29 05:33:39.095704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.095746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.869 [2024-11-29 05:33:39.095762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.095814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.869 [2024-11-29 05:33:39.095828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.095879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.869 [2024-11-29 05:33:39.095894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.869 #59 NEW cov: 11895 ft: 15395 corp: 37/2392b lim: 90 exec/s: 59 rss: 69Mb L: 83/88 MS: 1 ChangeByte- 00:08:27.869 [2024-11-29 05:33:39.135706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.869 [2024-11-29 05:33:39.135733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.135769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.869 [2024-11-29 05:33:39.135783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.869 [2024-11-29 05:33:39.135838] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.869 [2024-11-29 05:33:39.135853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.869 #60 NEW cov: 11895 ft: 15416 corp: 38/2462b lim: 90 exec/s: 60 rss: 69Mb L: 70/88 MS: 1 EraseBytes- 00:08:28.129 [2024-11-29 05:33:39.175809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.129 [2024-11-29 05:33:39.175836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.175879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.129 [2024-11-29 05:33:39.175894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.175946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.129 [2024-11-29 05:33:39.175961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.129 #61 NEW cov: 11895 ft: 15420 corp: 39/2530b lim: 90 exec/s: 61 rss: 69Mb L: 68/88 MS: 1 ChangeByte- 00:08:28.129 [2024-11-29 05:33:39.216066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.129 [2024-11-29 05:33:39.216094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.216140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.129 [2024-11-29 05:33:39.216156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.216208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.129 [2024-11-29 05:33:39.216226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.216277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:28.129 [2024-11-29 05:33:39.216292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.256015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.129 [2024-11-29 05:33:39.256043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.256079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.129 [2024-11-29 05:33:39.256094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.256148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.129 [2024-11-29 
05:33:39.256163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.129 #63 NEW cov: 11895 ft: 15432 corp: 40/2591b lim: 90 exec/s: 63 rss: 70Mb L: 61/88 MS: 2 ChangeBinInt-EraseBytes- 00:08:28.129 [2024-11-29 05:33:39.296248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.129 [2024-11-29 05:33:39.296276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.296329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.129 [2024-11-29 05:33:39.296345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.296400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.129 [2024-11-29 05:33:39.296414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.296468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:28.129 [2024-11-29 05:33:39.296483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.129 #64 NEW cov: 11895 ft: 15441 corp: 41/2679b lim: 90 exec/s: 64 rss: 70Mb L: 88/88 MS: 1 ChangeBinInt- 00:08:28.129 [2024-11-29 05:33:39.336202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.129 [2024-11-29 05:33:39.336229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.336267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.129 [2024-11-29 05:33:39.336282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.336333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.129 [2024-11-29 05:33:39.336348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.129 #65 NEW cov: 11895 ft: 15464 corp: 42/2744b lim: 90 exec/s: 65 rss: 70Mb L: 65/88 MS: 1 ChangeBinInt- 00:08:28.129 [2024-11-29 05:33:39.376368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.129 [2024-11-29 05:33:39.376397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.376438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.129 [2024-11-29 05:33:39.376457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.376512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.129 
[2024-11-29 05:33:39.376528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.416506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.129 [2024-11-29 05:33:39.416534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.416573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.129 [2024-11-29 05:33:39.416588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.129 [2024-11-29 05:33:39.416642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.129 [2024-11-29 05:33:39.416658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.389 #67 NEW cov: 11895 ft: 15511 corp: 43/2801b lim: 90 exec/s: 33 rss: 70Mb L: 57/88 MS: 2 CrossOver-PersAutoDict- DE: "\003\000"- 00:08:28.389 #67 DONE cov: 11895 ft: 15511 corp: 43/2801b lim: 90 exec/s: 33 rss: 70Mb 00:08:28.389 ###### Recommended dictionary. ###### 00:08:28.389 "\003\000" # Uses: 6 00:08:28.389 "\377\377\377\000" # Uses: 1 00:08:28.389 ###### End of recommended dictionary. ###### 00:08:28.389 Done 67 runs in 2 second(s) 00:08:28.389 05:33:39 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:28.389 05:33:39 -- ../common.sh@72 -- # (( i++ )) 00:08:28.389 05:33:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.389 05:33:39 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:28.389 05:33:39 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:28.389 05:33:39 -- nvmf/run.sh@24 -- # local timen=1 00:08:28.389 05:33:39 -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.389 05:33:39 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:28.389 05:33:39 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:28.389 05:33:39 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:28.389 05:33:39 -- nvmf/run.sh@29 -- # port=4421 00:08:28.389 05:33:39 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:28.389 05:33:39 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:28.389 05:33:39 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.389 05:33:39 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:28.389 [2024-11-29 05:33:39.592951] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:28.389 [2024-11-29 05:33:39.593031] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2222994 ] 00:08:28.389 EAL: No free 2048 kB hugepages reported on node 1 00:08:28.649 [2024-11-29 05:33:39.776207] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.649 [2024-11-29 05:33:39.795785] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:28.649 [2024-11-29 05:33:39.795910] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.649 [2024-11-29 05:33:39.847231] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:28.649 [2024-11-29 05:33:39.863586] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:28.649 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.649 INFO: Seed: 307594530 00:08:28.649 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:28.649 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:28.649 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:28.649 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.649 #2 INITED exec/s: 0 rss: 59Mb 00:08:28.649 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:28.649 This may also happen if the target rejected all inputs we tried so far 00:08:28.649 [2024-11-29 05:33:39.919138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.649 [2024-11-29 05:33:39.919169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.649 [2024-11-29 05:33:39.919206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.649 [2024-11-29 05:33:39.919222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.649 [2024-11-29 05:33:39.919278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.649 [2024-11-29 05:33:39.919292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.649 [2024-11-29 05:33:39.919349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.649 [2024-11-29 05:33:39.919364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.908 NEW_FUNC[1/672]: 0x4767c8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:28.908 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:28.908 #13 NEW cov: 11643 ft: 11644 corp: 2/48b lim: 50 exec/s: 0 rss: 67Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:29.167 [2024-11-29 05:33:40.220032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.167 [2024-11-29 
05:33:40.220079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.220140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.167 [2024-11-29 05:33:40.220159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.220217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.167 [2024-11-29 05:33:40.220245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.220307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.167 [2024-11-29 05:33:40.220325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.167 #14 NEW cov: 11756 ft: 12065 corp: 3/97b lim: 50 exec/s: 0 rss: 67Mb L: 49/49 MS: 1 CopyPart- 00:08:29.167 [2024-11-29 05:33:40.269689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.167 [2024-11-29 05:33:40.269719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.269765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.167 [2024-11-29 05:33:40.269784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.167 #23 NEW cov: 11762 ft: 12784 corp: 4/118b lim: 50 exec/s: 0 rss: 67Mb L: 21/49 MS: 4 ChangeBit-ShuffleBytes-ChangeByte-CrossOver- 00:08:29.167 [2024-11-29 05:33:40.310129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.167 [2024-11-29 05:33:40.310158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.310208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.167 [2024-11-29 05:33:40.310225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.310281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.167 [2024-11-29 05:33:40.310297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.310354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.167 [2024-11-29 05:33:40.310372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.167 #24 NEW cov: 11847 ft: 13101 corp: 5/165b lim: 50 exec/s: 0 rss: 67Mb L: 47/49 MS: 1 ChangeByte- 00:08:29.167 [2024-11-29 05:33:40.350395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:08:29.167 [2024-11-29 05:33:40.350422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.350475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.167 [2024-11-29 05:33:40.350489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.350544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.167 [2024-11-29 05:33:40.350559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.350619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.167 [2024-11-29 05:33:40.350633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.350690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.167 [2024-11-29 05:33:40.350705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.167 #25 NEW cov: 11847 ft: 13308 corp: 6/215b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:29.167 [2024-11-29 05:33:40.390344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.167 [2024-11-29 05:33:40.390371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.390415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.167 [2024-11-29 05:33:40.390431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.167 [2024-11-29 05:33:40.390484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.168 [2024-11-29 05:33:40.390499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.168 [2024-11-29 05:33:40.390555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.168 [2024-11-29 05:33:40.390572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.168 #26 NEW cov: 11847 ft: 13457 corp: 7/264b lim: 50 exec/s: 0 rss: 67Mb L: 49/50 MS: 1 ChangeBinInt- 00:08:29.168 [2024-11-29 05:33:40.430565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.168 [2024-11-29 05:33:40.430594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.168 [2024-11-29 05:33:40.430652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.168 [2024-11-29 05:33:40.430668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.168 [2024-11-29 05:33:40.430723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.168 [2024-11-29 05:33:40.430738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.168 [2024-11-29 05:33:40.430794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.168 [2024-11-29 05:33:40.430810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.168 [2024-11-29 05:33:40.430867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.168 [2024-11-29 05:33:40.430883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.168 #27 NEW cov: 11847 ft: 13629 corp: 8/314b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 CopyPart- 00:08:29.427 [2024-11-29 05:33:40.480774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.428 [2024-11-29 05:33:40.480801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.480866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.428 [2024-11-29 05:33:40.480880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.480939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.428 [2024-11-29 05:33:40.480956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.481015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.428 [2024-11-29 05:33:40.481032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.481092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.428 [2024-11-29 05:33:40.481108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.428 #28 NEW cov: 11847 ft: 13732 corp: 9/364b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeByte- 00:08:29.428 [2024-11-29 05:33:40.520866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.428 [2024-11-29 05:33:40.520894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.520946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.428 [2024-11-29 05:33:40.520963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.521022] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.428 [2024-11-29 05:33:40.521037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.521097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.428 [2024-11-29 05:33:40.521112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.521171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.428 [2024-11-29 05:33:40.521186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.428 #29 NEW cov: 11847 ft: 13798 corp: 10/414b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 CopyPart- 00:08:29.428 [2024-11-29 05:33:40.560525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.428 [2024-11-29 05:33:40.560552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.560612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.428 [2024-11-29 05:33:40.560628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.428 #30 NEW cov: 11847 ft: 13829 corp: 11/435b lim: 50 exec/s: 0 rss: 67Mb L: 21/50 MS: 1 ChangeBinInt- 00:08:29.428 [2024-11-29 05:33:40.600959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.428 [2024-11-29 05:33:40.600987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.601028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.428 [2024-11-29 05:33:40.601043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.601099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.428 [2024-11-29 05:33:40.601114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.601169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.428 [2024-11-29 05:33:40.601185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.428 #31 NEW cov: 11847 ft: 13837 corp: 12/483b lim: 50 exec/s: 0 rss: 67Mb L: 48/50 MS: 1 CopyPart- 00:08:29.428 [2024-11-29 05:33:40.641258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.428 [2024-11-29 05:33:40.641285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.641337] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.428 [2024-11-29 05:33:40.641353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.641407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.428 [2024-11-29 05:33:40.641423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.641477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.428 [2024-11-29 05:33:40.641491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.641552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.428 [2024-11-29 05:33:40.641567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.428 #32 NEW cov: 11847 ft: 13909 corp: 13/533b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 InsertByte- 00:08:29.428 [2024-11-29 05:33:40.681332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.428 [2024-11-29 05:33:40.681359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.681412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.428 [2024-11-29 05:33:40.681428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.681485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.428 [2024-11-29 05:33:40.681501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.681560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.428 [2024-11-29 05:33:40.681575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.681635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.428 [2024-11-29 05:33:40.681650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.428 #33 NEW cov: 11847 ft: 13953 corp: 14/583b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ChangeByte- 00:08:29.428 [2024-11-29 05:33:40.721435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.428 [2024-11-29 05:33:40.721463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.721514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.428 [2024-11-29 
05:33:40.721530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.721588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.428 [2024-11-29 05:33:40.721608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.428 [2024-11-29 05:33:40.721665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.428 [2024-11-29 05:33:40.721680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.429 [2024-11-29 05:33:40.721740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.429 [2024-11-29 05:33:40.721755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.688 #34 NEW cov: 11847 ft: 13994 corp: 15/633b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:29.688 [2024-11-29 05:33:40.761127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.688 [2024-11-29 05:33:40.761153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.688 [2024-11-29 05:33:40.761208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.688 [2024-11-29 05:33:40.761223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.688 #35 NEW cov: 11847 ft: 14027 corp: 16/654b lim: 50 exec/s: 0 rss: 68Mb L: 21/50 MS: 1 ChangeBit- 00:08:29.688 [2024-11-29 05:33:40.801701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.688 [2024-11-29 05:33:40.801729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.688 [2024-11-29 05:33:40.801793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.688 [2024-11-29 05:33:40.801810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.688 [2024-11-29 05:33:40.801866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.688 [2024-11-29 05:33:40.801882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.688 [2024-11-29 05:33:40.801938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.688 [2024-11-29 05:33:40.801953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.802008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.689 [2024-11-29 05:33:40.802024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 
cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.689 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.689 #36 NEW cov: 11870 ft: 14066 corp: 17/704b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:29.689 [2024-11-29 05:33:40.841516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.689 [2024-11-29 05:33:40.841544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.841603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.689 [2024-11-29 05:33:40.841619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.841679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.689 [2024-11-29 05:33:40.841696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.689 #37 NEW cov: 11870 ft: 14348 corp: 18/734b lim: 50 exec/s: 0 rss: 68Mb L: 30/50 MS: 1 EraseBytes- 00:08:29.689 [2024-11-29 05:33:40.881955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.689 [2024-11-29 05:33:40.881983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.882049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.689 [2024-11-29 05:33:40.882066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.882124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.689 [2024-11-29 05:33:40.882140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.882198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.689 [2024-11-29 05:33:40.882214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.882271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.689 [2024-11-29 05:33:40.882290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.689 #38 NEW cov: 11870 ft: 14355 corp: 19/784b lim: 50 exec/s: 38 rss: 68Mb L: 50/50 MS: 1 ChangeByte- 00:08:29.689 [2024-11-29 05:33:40.921756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.689 [2024-11-29 05:33:40.921784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.921822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.689 [2024-11-29 
05:33:40.921837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.921893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.689 [2024-11-29 05:33:40.921908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.689 #39 NEW cov: 11870 ft: 14387 corp: 20/821b lim: 50 exec/s: 39 rss: 68Mb L: 37/50 MS: 1 CrossOver- 00:08:29.689 [2024-11-29 05:33:40.961882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.689 [2024-11-29 05:33:40.961909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.961964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.689 [2024-11-29 05:33:40.961981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.689 [2024-11-29 05:33:40.962035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.689 [2024-11-29 05:33:40.962050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.689 #40 NEW cov: 11870 ft: 14399 corp: 21/857b lim: 50 exec/s: 40 rss: 68Mb L: 36/50 MS: 1 EraseBytes- 00:08:29.949 [2024-11-29 05:33:41.001817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.949 [2024-11-29 05:33:41.001844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.001902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.949 [2024-11-29 05:33:41.001918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.949 #41 NEW cov: 11870 ft: 14420 corp: 22/878b lim: 50 exec/s: 41 rss: 68Mb L: 21/50 MS: 1 ChangeBit- 00:08:29.949 [2024-11-29 05:33:41.042364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.949 [2024-11-29 05:33:41.042391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.042444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.949 [2024-11-29 05:33:41.042460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.042533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.949 [2024-11-29 05:33:41.042549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.042607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.949 [2024-11-29 
05:33:41.042624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.042686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.949 [2024-11-29 05:33:41.042711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.949 #42 NEW cov: 11870 ft: 14432 corp: 23/928b lim: 50 exec/s: 42 rss: 68Mb L: 50/50 MS: 1 ChangeByte- 00:08:29.949 [2024-11-29 05:33:41.082474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.949 [2024-11-29 05:33:41.082501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.082555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.949 [2024-11-29 05:33:41.082572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.082637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:29.949 [2024-11-29 05:33:41.082652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.082709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:29.949 [2024-11-29 05:33:41.082725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.082784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:29.949 [2024-11-29 05:33:41.082801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.949 #43 NEW cov: 11870 ft: 14446 corp: 24/978b lim: 50 exec/s: 43 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:29.949 [2024-11-29 05:33:41.122150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.949 [2024-11-29 05:33:41.122177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.122238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.949 [2024-11-29 05:33:41.122254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.949 #44 NEW cov: 11870 ft: 14467 corp: 25/1006b lim: 50 exec/s: 44 rss: 68Mb L: 28/50 MS: 1 CrossOver- 00:08:29.949 [2024-11-29 05:33:41.162253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.949 [2024-11-29 05:33:41.162281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.162347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.949 [2024-11-29 
05:33:41.162364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.949 #45 NEW cov: 11870 ft: 14487 corp: 26/1031b lim: 50 exec/s: 45 rss: 68Mb L: 25/50 MS: 1 EraseBytes- 00:08:29.949 [2024-11-29 05:33:41.202350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.949 [2024-11-29 05:33:41.202378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.202420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.949 [2024-11-29 05:33:41.202436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.949 #46 NEW cov: 11870 ft: 14505 corp: 27/1057b lim: 50 exec/s: 46 rss: 69Mb L: 26/50 MS: 1 InsertByte- 00:08:29.949 [2024-11-29 05:33:41.242472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:29.949 [2024-11-29 05:33:41.242500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.949 [2024-11-29 05:33:41.242542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:29.949 [2024-11-29 05:33:41.242558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.209 #52 NEW cov: 11870 ft: 14528 corp: 28/1083b lim: 50 exec/s: 52 rss: 69Mb L: 26/50 MS: 1 ShuffleBytes- 00:08:30.209 [2024-11-29 05:33:41.282915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.209 [2024-11-29 05:33:41.282943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.283006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.209 [2024-11-29 05:33:41.283022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.283079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.209 [2024-11-29 05:33:41.283094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.283151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.209 [2024-11-29 05:33:41.283167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.209 #53 NEW cov: 11870 ft: 14549 corp: 29/1132b lim: 50 exec/s: 53 rss: 69Mb L: 49/50 MS: 1 ShuffleBytes- 00:08:30.209 [2024-11-29 05:33:41.323060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.209 [2024-11-29 05:33:41.323087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.323137] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.209 [2024-11-29 05:33:41.323153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.323210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.209 [2024-11-29 05:33:41.323242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.323298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.209 [2024-11-29 05:33:41.323314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.209 #54 NEW cov: 11870 ft: 14645 corp: 30/1180b lim: 50 exec/s: 54 rss: 69Mb L: 48/50 MS: 1 EraseBytes- 00:08:30.209 [2024-11-29 05:33:41.362827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.209 [2024-11-29 05:33:41.362855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.362899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.209 [2024-11-29 05:33:41.362915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.209 #55 NEW cov: 11870 ft: 14654 corp: 31/1206b lim: 50 exec/s: 55 rss: 69Mb L: 26/50 MS: 1 InsertByte- 00:08:30.209 [2024-11-29 05:33:41.403262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.209 [2024-11-29 05:33:41.403291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.403332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.209 [2024-11-29 05:33:41.403348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.403406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.209 [2024-11-29 05:33:41.403421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.403478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.209 [2024-11-29 05:33:41.403495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.209 #56 NEW cov: 11870 ft: 14697 corp: 32/1255b lim: 50 exec/s: 56 rss: 69Mb L: 49/50 MS: 1 CopyPart- 00:08:30.209 [2024-11-29 05:33:41.443609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.209 [2024-11-29 05:33:41.443637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.443690] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.209 [2024-11-29 05:33:41.443706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.443764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.209 [2024-11-29 05:33:41.443781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.443848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.209 [2024-11-29 05:33:41.443865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.443921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:30.209 [2024-11-29 05:33:41.443936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.209 #57 NEW cov: 11870 ft: 14734 corp: 33/1305b lim: 50 exec/s: 57 rss: 69Mb L: 50/50 MS: 1 InsertByte- 00:08:30.209 [2024-11-29 05:33:41.483358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.209 [2024-11-29 05:33:41.483387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.483425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.209 [2024-11-29 05:33:41.483441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.209 [2024-11-29 05:33:41.483498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.209 [2024-11-29 05:33:41.483514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.209 #58 NEW cov: 11870 ft: 14790 corp: 34/1336b lim: 50 exec/s: 58 rss: 69Mb L: 31/50 MS: 1 EraseBytes- 00:08:30.469 [2024-11-29 05:33:41.523653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.469 [2024-11-29 05:33:41.523680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.523746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.469 [2024-11-29 05:33:41.523765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.523820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.469 [2024-11-29 05:33:41.523836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.523891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.469 [2024-11-29 
05:33:41.523907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.469 #59 NEW cov: 11870 ft: 14802 corp: 35/1383b lim: 50 exec/s: 59 rss: 69Mb L: 47/50 MS: 1 ShuffleBytes- 00:08:30.469 [2024-11-29 05:33:41.563792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.469 [2024-11-29 05:33:41.563820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.563887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.469 [2024-11-29 05:33:41.563903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.563957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.469 [2024-11-29 05:33:41.563972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.564031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.469 [2024-11-29 05:33:41.564047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.469 #60 NEW cov: 11870 ft: 14818 corp: 36/1431b lim: 50 exec/s: 60 rss: 69Mb L: 48/50 MS: 1 ChangeBit- 00:08:30.469 [2024-11-29 05:33:41.604079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.469 [2024-11-29 05:33:41.604106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.604177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.469 [2024-11-29 05:33:41.604194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.604252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.469 [2024-11-29 05:33:41.604266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.604323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.469 [2024-11-29 05:33:41.604339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.604399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:30.469 [2024-11-29 05:33:41.604413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.469 #66 NEW cov: 11870 ft: 14825 corp: 37/1481b lim: 50 exec/s: 66 rss: 69Mb L: 50/50 MS: 1 CopyPart- 00:08:30.469 [2024-11-29 05:33:41.644166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.469 [2024-11-29 
05:33:41.644193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.644244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.469 [2024-11-29 05:33:41.644260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.644314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.469 [2024-11-29 05:33:41.644329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.469 [2024-11-29 05:33:41.644386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.469 [2024-11-29 05:33:41.644402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.470 [2024-11-29 05:33:41.644458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:30.470 [2024-11-29 05:33:41.644475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.470 #67 NEW cov: 11870 ft: 14873 corp: 38/1531b lim: 50 exec/s: 67 rss: 69Mb L: 50/50 MS: 1 ChangeByte- 00:08:30.470 [2024-11-29 05:33:41.683778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.470 [2024-11-29 05:33:41.683805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.470 [2024-11-29 05:33:41.683845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.470 [2024-11-29 05:33:41.683860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.470 #68 NEW cov: 11870 ft: 14883 corp: 39/1557b lim: 50 exec/s: 68 rss: 69Mb L: 26/50 MS: 1 CrossOver- 00:08:30.470 [2024-11-29 05:33:41.724195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.470 [2024-11-29 05:33:41.724223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.470 [2024-11-29 05:33:41.724268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.470 [2024-11-29 05:33:41.724284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.470 [2024-11-29 05:33:41.724342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.470 [2024-11-29 05:33:41.724356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.470 [2024-11-29 05:33:41.724412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.470 [2024-11-29 05:33:41.724427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.470 #69 NEW cov: 11870 ft: 14893 corp: 40/1604b lim: 50 exec/s: 69 rss: 69Mb L: 47/50 MS: 1 ChangeByte- 00:08:30.470 [2024-11-29 05:33:41.764304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.470 [2024-11-29 05:33:41.764331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.470 [2024-11-29 05:33:41.764392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.470 [2024-11-29 05:33:41.764408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.470 [2024-11-29 05:33:41.764463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.470 [2024-11-29 05:33:41.764479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.470 [2024-11-29 05:33:41.764539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.470 [2024-11-29 05:33:41.764554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.730 #70 NEW cov: 11870 ft: 14903 corp: 41/1652b lim: 50 exec/s: 70 rss: 69Mb L: 48/50 MS: 1 InsertRepeatedBytes- 00:08:30.730 [2024-11-29 05:33:41.804398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.730 [2024-11-29 05:33:41.804426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.730 [2024-11-29 05:33:41.804475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.730 [2024-11-29 05:33:41.804491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.730 [2024-11-29 05:33:41.804544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.730 [2024-11-29 05:33:41.804560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.730 [2024-11-29 05:33:41.804621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.730 [2024-11-29 05:33:41.804654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.730 #71 NEW cov: 11870 ft: 14916 corp: 42/1700b lim: 50 exec/s: 71 rss: 69Mb L: 48/50 MS: 1 ChangeBinInt- 00:08:30.730 [2024-11-29 05:33:41.844383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.730 [2024-11-29 05:33:41.844410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.730 [2024-11-29 05:33:41.844453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.730 [2024-11-29 05:33:41.844468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.730 [2024-11-29 05:33:41.844525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.730 [2024-11-29 05:33:41.844540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.730 #72 NEW cov: 11870 ft: 14918 corp: 43/1737b lim: 50 exec/s: 72 rss: 69Mb L: 37/50 MS: 1 ChangeByte- 00:08:30.731 [2024-11-29 05:33:41.884631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.731 [2024-11-29 05:33:41.884658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.731 [2024-11-29 05:33:41.884724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.731 [2024-11-29 05:33:41.884740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.731 [2024-11-29 05:33:41.884797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.731 [2024-11-29 05:33:41.884813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.731 [2024-11-29 05:33:41.884872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.731 [2024-11-29 05:33:41.884888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.731 #73 NEW cov: 11870 ft: 14919 corp: 44/1785b lim: 50 exec/s: 36 rss: 69Mb L: 48/50 MS: 1 ChangeBit- 00:08:30.731 #73 DONE cov: 11870 ft: 14919 corp: 44/1785b lim: 50 exec/s: 36 rss: 69Mb 00:08:30.731 Done 73 runs in 2 second(s) 00:08:30.731 05:33:42 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:30.731 05:33:42 -- ../common.sh@72 -- # (( i++ )) 00:08:30.731 05:33:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.731 05:33:42 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:30.731 05:33:42 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:30.731 05:33:42 -- nvmf/run.sh@24 -- # local timen=1 00:08:30.731 05:33:42 -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.731 05:33:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:30.731 05:33:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:30.731 05:33:42 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:30.731 05:33:42 -- nvmf/run.sh@29 -- # port=4422 00:08:30.731 05:33:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:30.990 05:33:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:30.990 05:33:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.990 05:33:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 
-c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:30.990 [2024-11-29 05:33:42.066383] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:30.990 [2024-11-29 05:33:42.066452] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2223388 ] 00:08:30.990 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.990 [2024-11-29 05:33:42.240948] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.990 [2024-11-29 05:33:42.260763] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:30.990 [2024-11-29 05:33:42.260902] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.250 [2024-11-29 05:33:42.312336] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:31.250 [2024-11-29 05:33:42.328680] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:31.250 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.250 INFO: Seed: 2771623057 00:08:31.250 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:31.250 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:31.250 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:31.250 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.250 #2 INITED exec/s: 0 rss: 59Mb 00:08:31.250 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:31.250 This may also happen if the target rejected all inputs we tried so far 00:08:31.250 [2024-11-29 05:33:42.373710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.250 [2024-11-29 05:33:42.373741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.509 NEW_FUNC[1/672]: 0x478a98 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:31.510 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:31.510 #10 NEW cov: 11669 ft: 11670 corp: 2/19b lim: 85 exec/s: 0 rss: 67Mb L: 18/18 MS: 3 InsertRepeatedBytes-CopyPart-CopyPart- 00:08:31.510 [2024-11-29 05:33:42.674465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.510 [2024-11-29 05:33:42.674497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.510 #11 NEW cov: 11782 ft: 12098 corp: 3/37b lim: 85 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ChangeBinInt- 00:08:31.510 [2024-11-29 05:33:42.714534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.510 [2024-11-29 05:33:42.714562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.510 #12 NEW cov: 11788 ft: 12485 corp: 4/56b lim: 85 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertByte- 00:08:31.510 [2024-11-29 05:33:42.754596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.510 [2024-11-29 05:33:42.754628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.510 #14 NEW cov: 11873 ft: 12738 corp: 5/73b lim: 85 exec/s: 0 rss: 67Mb L: 17/19 MS: 2 ChangeBinInt-CrossOver- 00:08:31.510 [2024-11-29 05:33:42.794688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.510 [2024-11-29 05:33:42.794715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.769 #15 NEW cov: 11873 ft: 12880 corp: 6/97b lim: 85 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 CopyPart- 00:08:31.769 [2024-11-29 05:33:42.834779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.770 [2024-11-29 05:33:42.834806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.770 #16 NEW cov: 11873 ft: 13022 corp: 7/115b lim: 85 exec/s: 0 rss: 67Mb L: 18/24 MS: 1 CrossOver- 00:08:31.770 [2024-11-29 05:33:42.874907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.770 [2024-11-29 05:33:42.874934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.770 #22 NEW cov: 11873 ft: 13120 corp: 8/133b lim: 85 exec/s: 0 rss: 67Mb L: 18/24 MS: 1 ChangeBinInt- 00:08:31.770 [2024-11-29 05:33:42.915351] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.770 [2024-11-29 05:33:42.915379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.770 [2024-11-29 05:33:42.915422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:31.770 [2024-11-29 05:33:42.915438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.770 [2024-11-29 05:33:42.915493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:31.770 [2024-11-29 05:33:42.915508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.770 #26 NEW cov: 11873 ft: 14030 corp: 9/191b lim: 85 exec/s: 0 rss: 67Mb L: 58/58 MS: 4 CopyPart-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:31.770 [2024-11-29 05:33:42.955300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.770 [2024-11-29 05:33:42.955326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.770 [2024-11-29 05:33:42.955363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:31.770 [2024-11-29 05:33:42.955379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.770 #30 NEW cov: 11873 ft: 14368 corp: 10/228b lim: 85 exec/s: 0 rss: 67Mb L: 37/58 MS: 4 EraseBytes-ChangeBinInt-ShuffleBytes-InsertRepeatedBytes- 00:08:31.770 [2024-11-29 05:33:43.005318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.770 [2024-11-29 05:33:43.005345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.770 #31 NEW cov: 11873 ft: 14418 corp: 11/246b lim: 85 exec/s: 0 rss: 68Mb L: 18/58 MS: 1 ChangeBit- 00:08:31.770 [2024-11-29 05:33:43.045731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:31.770 [2024-11-29 05:33:43.045758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.770 [2024-11-29 05:33:43.045795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:31.770 [2024-11-29 05:33:43.045810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.770 [2024-11-29 05:33:43.045851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:31.770 [2024-11-29 05:33:43.045866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.029 #32 NEW cov: 11873 ft: 14452 corp: 12/304b lim: 85 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 ChangeBit- 00:08:32.029 [2024-11-29 05:33:43.095556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.029 [2024-11-29 
05:33:43.095583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.029 #38 NEW cov: 11873 ft: 14498 corp: 13/321b lim: 85 exec/s: 0 rss: 68Mb L: 17/58 MS: 1 ShuffleBytes- 00:08:32.029 [2024-11-29 05:33:43.135664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.029 [2024-11-29 05:33:43.135690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.029 #39 NEW cov: 11873 ft: 14535 corp: 14/339b lim: 85 exec/s: 0 rss: 68Mb L: 18/58 MS: 1 ChangeByte- 00:08:32.029 [2024-11-29 05:33:43.176096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.030 [2024-11-29 05:33:43.176122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.030 [2024-11-29 05:33:43.176177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.030 [2024-11-29 05:33:43.176193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.030 [2024-11-29 05:33:43.176248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.030 [2024-11-29 05:33:43.176263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.030 #40 NEW cov: 11873 ft: 14602 corp: 15/394b lim: 85 exec/s: 0 rss: 68Mb L: 55/58 MS: 1 CopyPart- 00:08:32.030 [2024-11-29 05:33:43.216023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.030 [2024-11-29 05:33:43.216053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.030 [2024-11-29 05:33:43.216117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.030 [2024-11-29 05:33:43.216134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.030 #41 NEW cov: 11873 ft: 14624 corp: 16/428b lim: 85 exec/s: 0 rss: 68Mb L: 34/58 MS: 1 CopyPart- 00:08:32.030 [2024-11-29 05:33:43.256456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.030 [2024-11-29 05:33:43.256484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.030 [2024-11-29 05:33:43.256546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.030 [2024-11-29 05:33:43.256562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.030 [2024-11-29 05:33:43.256626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.030 [2024-11-29 05:33:43.256642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.030 [2024-11-29 05:33:43.256696] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:32.030 [2024-11-29 05:33:43.256710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.030 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:32.030 #42 NEW cov: 11896 ft: 15061 corp: 17/511b lim: 85 exec/s: 0 rss: 68Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:08:32.030 [2024-11-29 05:33:43.306219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.030 [2024-11-29 05:33:43.306247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.289 #48 NEW cov: 11896 ft: 15105 corp: 18/529b lim: 85 exec/s: 0 rss: 68Mb L: 18/83 MS: 1 ChangeBit- 00:08:32.289 [2024-11-29 05:33:43.346282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.289 [2024-11-29 05:33:43.346309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.289 #49 NEW cov: 11896 ft: 15152 corp: 19/547b lim: 85 exec/s: 49 rss: 68Mb L: 18/83 MS: 1 ChangeBit- 00:08:32.289 [2024-11-29 05:33:43.386528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.289 [2024-11-29 05:33:43.386557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.289 [2024-11-29 05:33:43.386610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.289 [2024-11-29 05:33:43.386627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.289 #50 NEW cov: 11896 ft: 15200 corp: 20/586b lim: 85 exec/s: 50 rss: 68Mb L: 39/83 MS: 1 EraseBytes- 00:08:32.289 [2024-11-29 05:33:43.427151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.289 [2024-11-29 05:33:43.427177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.289 [2024-11-29 05:33:43.427232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.289 [2024-11-29 05:33:43.427246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.289 [2024-11-29 05:33:43.427297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.289 [2024-11-29 05:33:43.427312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.289 [2024-11-29 05:33:43.427364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:32.289 [2024-11-29 05:33:43.427379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.289 [2024-11-29 05:33:43.427432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 
cid:4 nsid:0 00:08:32.289 [2024-11-29 05:33:43.427447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:32.289 #51 NEW cov: 11896 ft: 15336 corp: 21/671b lim: 85 exec/s: 51 rss: 68Mb L: 85/85 MS: 1 CopyPart- 00:08:32.289 [2024-11-29 05:33:43.466935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.289 [2024-11-29 05:33:43.466965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.289 [2024-11-29 05:33:43.467007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.289 [2024-11-29 05:33:43.467022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.289 [2024-11-29 05:33:43.467074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.289 [2024-11-29 05:33:43.467106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.289 #52 NEW cov: 11896 ft: 15411 corp: 22/729b lim: 85 exec/s: 52 rss: 68Mb L: 58/85 MS: 1 ShuffleBytes- 00:08:32.289 [2024-11-29 05:33:43.516802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.289 [2024-11-29 05:33:43.516830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.289 #53 NEW cov: 11896 ft: 15422 corp: 23/747b lim: 85 exec/s: 53 rss: 68Mb L: 18/85 MS: 1 ChangeBit- 00:08:32.289 [2024-11-29 05:33:43.557042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.289 [2024-11-29 05:33:43.557069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.289 [2024-11-29 05:33:43.557123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.289 [2024-11-29 05:33:43.557138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.289 #56 NEW cov: 11896 ft: 15462 corp: 24/792b lim: 85 exec/s: 56 rss: 68Mb L: 45/85 MS: 3 EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:32.547 [2024-11-29 05:33:43.597168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.547 [2024-11-29 05:33:43.597195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.547 [2024-11-29 05:33:43.597246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.547 [2024-11-29 05:33:43.597263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.547 #57 NEW cov: 11896 ft: 15477 corp: 25/826b lim: 85 exec/s: 57 rss: 69Mb L: 34/85 MS: 1 CopyPart- 00:08:32.547 [2024-11-29 05:33:43.637230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.547 [2024-11-29 05:33:43.637257] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.547 [2024-11-29 05:33:43.637298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.547 [2024-11-29 05:33:43.637313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.547 #58 NEW cov: 11896 ft: 15495 corp: 26/864b lim: 85 exec/s: 58 rss: 69Mb L: 38/85 MS: 1 InsertRepeatedBytes- 00:08:32.547 [2024-11-29 05:33:43.677518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.547 [2024-11-29 05:33:43.677546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.547 [2024-11-29 05:33:43.677585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.547 [2024-11-29 05:33:43.677604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.547 [2024-11-29 05:33:43.677658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.547 [2024-11-29 05:33:43.677678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.547 #59 NEW cov: 11896 ft: 15511 corp: 27/919b lim: 85 exec/s: 59 rss: 69Mb L: 55/85 MS: 1 ShuffleBytes- 00:08:32.547 [2024-11-29 05:33:43.717357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.547 [2024-11-29 05:33:43.717383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.547 #61 NEW cov: 11896 ft: 15583 corp: 28/936b lim: 85 exec/s: 61 rss: 69Mb L: 17/85 MS: 2 InsertByte-CrossOver- 00:08:32.547 [2024-11-29 05:33:43.757608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.547 [2024-11-29 05:33:43.757635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.547 [2024-11-29 05:33:43.757673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.547 [2024-11-29 05:33:43.757687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.547 #62 NEW cov: 11896 ft: 15600 corp: 29/973b lim: 85 exec/s: 62 rss: 69Mb L: 37/85 MS: 1 InsertRepeatedBytes- 00:08:32.547 [2024-11-29 05:33:43.797845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.547 [2024-11-29 05:33:43.797873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.547 [2024-11-29 05:33:43.797915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.547 [2024-11-29 05:33:43.797928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.547 [2024-11-29 05:33:43.797982] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.547 [2024-11-29 05:33:43.797997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.547 #63 NEW cov: 11896 ft: 15604 corp: 30/1029b lim: 85 exec/s: 63 rss: 69Mb L: 56/85 MS: 1 InsertRepeatedBytes- 00:08:32.547 [2024-11-29 05:33:43.837668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.547 [2024-11-29 05:33:43.837695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.807 #64 NEW cov: 11896 ft: 15613 corp: 31/1047b lim: 85 exec/s: 64 rss: 69Mb L: 18/85 MS: 1 ChangeBit- 00:08:32.807 [2024-11-29 05:33:43.877798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.807 [2024-11-29 05:33:43.877825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.807 #65 NEW cov: 11896 ft: 15647 corp: 32/1065b lim: 85 exec/s: 65 rss: 69Mb L: 18/85 MS: 1 ChangeBit- 00:08:32.807 [2024-11-29 05:33:43.918220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.807 [2024-11-29 05:33:43.918247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.807 [2024-11-29 05:33:43.918288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.807 [2024-11-29 05:33:43.918304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.807 [2024-11-29 05:33:43.918354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.807 [2024-11-29 05:33:43.918369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.807 #66 NEW cov: 11896 ft: 15668 corp: 33/1130b lim: 85 exec/s: 66 rss: 69Mb L: 65/85 MS: 1 InsertRepeatedBytes- 00:08:32.807 [2024-11-29 05:33:43.958011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.807 [2024-11-29 05:33:43.958038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.807 #67 NEW cov: 11896 ft: 15679 corp: 34/1148b lim: 85 exec/s: 67 rss: 69Mb L: 18/85 MS: 1 ChangeByte- 00:08:32.807 [2024-11-29 05:33:43.998135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.807 [2024-11-29 05:33:43.998162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.807 #68 NEW cov: 11896 ft: 15706 corp: 35/1165b lim: 85 exec/s: 68 rss: 69Mb L: 17/85 MS: 1 ChangeByte- 00:08:32.807 [2024-11-29 05:33:44.038868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.807 [2024-11-29 05:33:44.038896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.807 
[2024-11-29 05:33:44.038966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.807 [2024-11-29 05:33:44.038982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.807 [2024-11-29 05:33:44.039034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.807 [2024-11-29 05:33:44.039050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.807 [2024-11-29 05:33:44.039103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:32.807 [2024-11-29 05:33:44.039117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.807 [2024-11-29 05:33:44.039170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:32.807 [2024-11-29 05:33:44.039185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:32.807 #69 NEW cov: 11896 ft: 15787 corp: 36/1250b lim: 85 exec/s: 69 rss: 69Mb L: 85/85 MS: 1 ChangeByte- 00:08:32.807 [2024-11-29 05:33:44.088531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.807 [2024-11-29 05:33:44.088558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.807 [2024-11-29 05:33:44.088604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.807 [2024-11-29 05:33:44.088619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.067 #70 NEW cov: 11896 ft: 15828 corp: 37/1294b lim: 85 exec/s: 70 rss: 69Mb L: 44/85 MS: 1 CrossOver- 00:08:33.067 [2024-11-29 05:33:44.138557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.067 [2024-11-29 05:33:44.138584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.067 #71 NEW cov: 11896 ft: 15859 corp: 38/1312b lim: 85 exec/s: 71 rss: 69Mb L: 18/85 MS: 1 ShuffleBytes- 00:08:33.067 [2024-11-29 05:33:44.178641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.067 [2024-11-29 05:33:44.178669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.067 #72 NEW cov: 11896 ft: 15889 corp: 39/1331b lim: 85 exec/s: 72 rss: 69Mb L: 19/85 MS: 1 InsertByte- 00:08:33.067 [2024-11-29 05:33:44.218766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.067 [2024-11-29 05:33:44.218798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.067 #73 NEW cov: 11896 ft: 15894 corp: 40/1349b lim: 85 exec/s: 73 rss: 69Mb L: 18/85 MS: 1 ChangeByte- 00:08:33.067 [2024-11-29 05:33:44.259484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.067 [2024-11-29 05:33:44.259511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.067 [2024-11-29 05:33:44.259560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.067 [2024-11-29 05:33:44.259574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.067 [2024-11-29 05:33:44.259646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.067 [2024-11-29 05:33:44.259663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.067 [2024-11-29 05:33:44.259716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:33.067 [2024-11-29 05:33:44.259731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.067 [2024-11-29 05:33:44.259781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:33.067 [2024-11-29 05:33:44.259796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:33.067 #74 NEW cov: 11896 ft: 15909 corp: 41/1434b lim: 85 exec/s: 74 rss: 70Mb L: 85/85 MS: 1 ShuffleBytes- 00:08:33.067 [2024-11-29 05:33:44.309190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.067 [2024-11-29 05:33:44.309217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.067 [2024-11-29 05:33:44.309252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.067 [2024-11-29 05:33:44.309267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.067 #75 NEW cov: 11896 ft: 15911 corp: 42/1480b lim: 85 exec/s: 75 rss: 70Mb L: 46/85 MS: 1 CrossOver- 00:08:33.067 [2024-11-29 05:33:44.349157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.067 [2024-11-29 05:33:44.349184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.067 #76 NEW cov: 11896 ft: 15923 corp: 43/1502b lim: 85 exec/s: 38 rss: 70Mb L: 22/85 MS: 1 CopyPart- 00:08:33.067 #76 DONE cov: 11896 ft: 15923 corp: 43/1502b lim: 85 exec/s: 38 rss: 70Mb 00:08:33.067 Done 76 runs in 2 second(s) 00:08:33.327 05:33:44 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:33.327 05:33:44 -- ../common.sh@72 -- # (( i++ )) 00:08:33.327 05:33:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.327 05:33:44 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:33.327 05:33:44 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:33.327 05:33:44 -- nvmf/run.sh@24 -- # local timen=1 00:08:33.327 05:33:44 -- nvmf/run.sh@25 -- # local core=0x1 00:08:33.327 05:33:44 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:33.327 05:33:44 
-- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:33.327 05:33:44 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:33.327 05:33:44 -- nvmf/run.sh@29 -- # port=4423 00:08:33.327 05:33:44 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:33.327 05:33:44 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:33.327 05:33:44 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:33.327 05:33:44 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:33.327 [2024-11-29 05:33:44.524419] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:33.327 [2024-11-29 05:33:44.524490] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2223928 ] 00:08:33.327 EAL: No free 2048 kB hugepages reported on node 1 00:08:33.587 [2024-11-29 05:33:44.698962] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.587 [2024-11-29 05:33:44.718252] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:33.587 [2024-11-29 05:33:44.718388] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.587 [2024-11-29 05:33:44.769738] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.587 [2024-11-29 05:33:44.786102] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:33.587 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.587 INFO: Seed: 935634339 00:08:33.587 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:33.587 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:33.587 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:33.587 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.587 #2 INITED exec/s: 0 rss: 60Mb 00:08:33.587 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:33.587 This may also happen if the target rejected all inputs we tried so far 00:08:33.587 [2024-11-29 05:33:44.841535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.587 [2024-11-29 05:33:44.841565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.587 [2024-11-29 05:33:44.841611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.587 [2024-11-29 05:33:44.841627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.587 [2024-11-29 05:33:44.841682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.587 [2024-11-29 05:33:44.841698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.587 [2024-11-29 05:33:44.841752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.587 [2024-11-29 05:33:44.841767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.847 NEW_FUNC[1/671]: 0x47bcd8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:33.847 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.847 #19 NEW cov: 11602 ft: 11603 corp: 2/25b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:33.847 [2024-11-29 05:33:45.132146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.847 [2024-11-29 05:33:45.132178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.847 [2024-11-29 05:33:45.132217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.847 [2024-11-29 05:33:45.132236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.847 [2024-11-29 05:33:45.132287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.847 [2024-11-29 05:33:45.132303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.847 [2024-11-29 05:33:45.132356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.847 [2024-11-29 05:33:45.132371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.106 #20 NEW cov: 11715 ft: 12134 corp: 3/48b lim: 25 exec/s: 0 rss: 67Mb L: 23/24 MS: 1 CrossOver- 00:08:34.106 [2024-11-29 05:33:45.172222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.106 [2024-11-29 05:33:45.172249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.106 
[2024-11-29 05:33:45.172298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.106 [2024-11-29 05:33:45.172314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.172367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.106 [2024-11-29 05:33:45.172381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.172436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.106 [2024-11-29 05:33:45.172450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.106 #21 NEW cov: 11721 ft: 12387 corp: 4/72b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 ChangeBinInt- 00:08:34.106 [2024-11-29 05:33:45.212391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.106 [2024-11-29 05:33:45.212418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.212467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.106 [2024-11-29 05:33:45.212483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.212535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.106 [2024-11-29 05:33:45.212549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.212604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.106 [2024-11-29 05:33:45.212619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.106 #22 NEW cov: 11806 ft: 12675 corp: 5/96b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 ChangeByte- 00:08:34.106 [2024-11-29 05:33:45.252485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.106 [2024-11-29 05:33:45.252512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.252578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.106 [2024-11-29 05:33:45.252594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.252655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.106 [2024-11-29 05:33:45.252674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.252729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 
00:08:34.106 [2024-11-29 05:33:45.252745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.106 #23 NEW cov: 11806 ft: 12734 corp: 6/120b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertByte- 00:08:34.106 [2024-11-29 05:33:45.292437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.106 [2024-11-29 05:33:45.292464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.292512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.106 [2024-11-29 05:33:45.292528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.292583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.106 [2024-11-29 05:33:45.292603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.106 #24 NEW cov: 11806 ft: 13255 corp: 7/137b lim: 25 exec/s: 0 rss: 67Mb L: 17/24 MS: 1 EraseBytes- 00:08:34.106 [2024-11-29 05:33:45.332680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.106 [2024-11-29 05:33:45.332707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.332776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.106 [2024-11-29 05:33:45.332791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.332844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.106 [2024-11-29 05:33:45.332859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.332915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.106 [2024-11-29 05:33:45.332930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.106 #30 NEW cov: 11806 ft: 13296 corp: 8/161b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 CopyPart- 00:08:34.106 [2024-11-29 05:33:45.372959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.106 [2024-11-29 05:33:45.372987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.373045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.106 [2024-11-29 05:33:45.373059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.373113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.106 [2024-11-29 
05:33:45.373128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.373180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.106 [2024-11-29 05:33:45.373195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.106 [2024-11-29 05:33:45.373249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:34.106 [2024-11-29 05:33:45.373268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.106 #31 NEW cov: 11806 ft: 13379 corp: 9/186b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertByte- 00:08:34.366 [2024-11-29 05:33:45.413056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.366 [2024-11-29 05:33:45.413083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.366 [2024-11-29 05:33:45.413139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.366 [2024-11-29 05:33:45.413155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.366 [2024-11-29 05:33:45.413208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.366 [2024-11-29 05:33:45.413239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.366 [2024-11-29 05:33:45.413296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.367 [2024-11-29 05:33:45.413311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.413366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:34.367 [2024-11-29 05:33:45.413382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.367 #32 NEW cov: 11806 ft: 13483 corp: 10/211b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertByte- 00:08:34.367 [2024-11-29 05:33:45.453086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.367 [2024-11-29 05:33:45.453114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.453166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.367 [2024-11-29 05:33:45.453181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.453236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.367 [2024-11-29 05:33:45.453250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.453307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.367 [2024-11-29 05:33:45.453321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.367 #33 NEW cov: 11806 ft: 13522 corp: 11/235b lim: 25 exec/s: 0 rss: 68Mb L: 24/25 MS: 1 ShuffleBytes- 00:08:34.367 [2024-11-29 05:33:45.493296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.367 [2024-11-29 05:33:45.493324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.493396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.367 [2024-11-29 05:33:45.493412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.493464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.367 [2024-11-29 05:33:45.493479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.493531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.367 [2024-11-29 05:33:45.493549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.493605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:34.367 [2024-11-29 05:33:45.493620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.367 #34 NEW cov: 11806 ft: 13571 corp: 12/260b lim: 25 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:34.367 [2024-11-29 05:33:45.533449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.367 [2024-11-29 05:33:45.533477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.533530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.367 [2024-11-29 05:33:45.533545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.533603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.367 [2024-11-29 05:33:45.533620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.533673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.367 [2024-11-29 05:33:45.533688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.533745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:34.367 [2024-11-29 05:33:45.533760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.367 #40 NEW cov: 11806 ft: 13577 corp: 13/285b lim: 25 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 CopyPart- 00:08:34.367 [2024-11-29 05:33:45.573415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.367 [2024-11-29 05:33:45.573443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.573492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.367 [2024-11-29 05:33:45.573509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.573564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.367 [2024-11-29 05:33:45.573580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.573636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.367 [2024-11-29 05:33:45.573651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.367 #41 NEW cov: 11806 ft: 13656 corp: 14/308b lim: 25 exec/s: 0 rss: 68Mb L: 23/25 MS: 1 CMP- DE: "\003\000\000\000"- 00:08:34.367 [2024-11-29 05:33:45.613648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.367 [2024-11-29 05:33:45.613676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.613724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.367 [2024-11-29 05:33:45.613737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.613791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.367 [2024-11-29 05:33:45.613807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.613859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.367 [2024-11-29 05:33:45.613874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.613925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:34.367 [2024-11-29 05:33:45.613939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.367 #42 NEW cov: 11806 ft: 13690 corp: 15/333b lim: 25 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:34.367 [2024-11-29 05:33:45.653489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.367 [2024-11-29 05:33:45.653517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.367 [2024-11-29 05:33:45.653559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.367 [2024-11-29 05:33:45.653575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.627 #43 NEW cov: 11806 ft: 13971 corp: 16/346b lim: 25 exec/s: 0 rss: 68Mb L: 13/25 MS: 1 EraseBytes- 00:08:34.627 [2024-11-29 05:33:45.693899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.627 [2024-11-29 05:33:45.693927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.693995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.628 [2024-11-29 05:33:45.694011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.694063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.628 [2024-11-29 05:33:45.694078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.694129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.628 [2024-11-29 05:33:45.694144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.694201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:34.628 [2024-11-29 05:33:45.694217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.628 #44 NEW cov: 11806 ft: 14002 corp: 17/371b lim: 25 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 ChangeByte- 00:08:34.628 [2024-11-29 05:33:45.733927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.628 [2024-11-29 05:33:45.733954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.733999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.628 [2024-11-29 05:33:45.734014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.734070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.628 [2024-11-29 05:33:45.734085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.734141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.628 [2024-11-29 05:33:45.734156] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.628 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:34.628 #45 NEW cov: 11829 ft: 14046 corp: 18/395b lim: 25 exec/s: 0 rss: 68Mb L: 24/25 MS: 1 CopyPart- 00:08:34.628 [2024-11-29 05:33:45.774020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.628 [2024-11-29 05:33:45.774047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.774096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.628 [2024-11-29 05:33:45.774111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.774163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.628 [2024-11-29 05:33:45.774179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.774233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.628 [2024-11-29 05:33:45.774248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.628 #46 NEW cov: 11829 ft: 14072 corp: 19/418b lim: 25 exec/s: 0 rss: 68Mb L: 23/25 MS: 1 CMP- DE: "\373\377\377\377"- 00:08:34.628 [2024-11-29 05:33:45.804091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.628 [2024-11-29 05:33:45.804118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.804167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.628 [2024-11-29 05:33:45.804183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.804238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.628 [2024-11-29 05:33:45.804253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.804306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.628 [2024-11-29 05:33:45.804321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.628 #47 NEW cov: 11829 ft: 14120 corp: 20/442b lim: 25 exec/s: 47 rss: 68Mb L: 24/25 MS: 1 ChangeByte- 00:08:34.628 [2024-11-29 05:33:45.843901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.628 [2024-11-29 05:33:45.843928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.628 #48 NEW cov: 11829 ft: 14563 corp: 21/447b lim: 25 exec/s: 48 rss: 68Mb L: 5/25 MS: 1 
PersAutoDict- DE: "\003\000\000\000"- 00:08:34.628 [2024-11-29 05:33:45.884231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.628 [2024-11-29 05:33:45.884258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.884324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.628 [2024-11-29 05:33:45.884340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.884399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.628 [2024-11-29 05:33:45.884414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.628 #49 NEW cov: 11829 ft: 14570 corp: 22/464b lim: 25 exec/s: 49 rss: 68Mb L: 17/25 MS: 1 PersAutoDict- DE: "\003\000\000\000"- 00:08:34.628 [2024-11-29 05:33:45.924493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.628 [2024-11-29 05:33:45.924520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.924572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.628 [2024-11-29 05:33:45.924588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.924647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.628 [2024-11-29 05:33:45.924663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.628 [2024-11-29 05:33:45.924717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.628 [2024-11-29 05:33:45.924732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.888 #50 NEW cov: 11829 ft: 14581 corp: 23/487b lim: 25 exec/s: 50 rss: 68Mb L: 23/25 MS: 1 ChangeByte- 00:08:34.888 [2024-11-29 05:33:45.964329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.888 [2024-11-29 05:33:45.964355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.888 [2024-11-29 05:33:45.964394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.888 [2024-11-29 05:33:45.964409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.888 #51 NEW cov: 11829 ft: 14675 corp: 24/500b lim: 25 exec/s: 51 rss: 68Mb L: 13/25 MS: 1 ChangeBit- 00:08:34.888 [2024-11-29 05:33:46.004562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.888 [2024-11-29 05:33:46.004590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.888 [2024-11-29 05:33:46.004653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.888 [2024-11-29 05:33:46.004670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.888 [2024-11-29 05:33:46.004725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.888 [2024-11-29 05:33:46.004741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.888 #52 NEW cov: 11829 ft: 14689 corp: 25/519b lim: 25 exec/s: 52 rss: 68Mb L: 19/25 MS: 1 CopyPart- 00:08:34.888 [2024-11-29 05:33:46.044956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.888 [2024-11-29 05:33:46.044982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.888 [2024-11-29 05:33:46.045039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.888 [2024-11-29 05:33:46.045052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.045107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.889 [2024-11-29 05:33:46.045125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.045177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.889 [2024-11-29 05:33:46.045191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.045243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:34.889 [2024-11-29 05:33:46.045258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.889 #53 NEW cov: 11829 ft: 14700 corp: 26/544b lim: 25 exec/s: 53 rss: 68Mb L: 25/25 MS: 1 ChangeBit- 00:08:34.889 [2024-11-29 05:33:46.084814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.889 [2024-11-29 05:33:46.084841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.084899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.889 [2024-11-29 05:33:46.084915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.084970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.889 [2024-11-29 05:33:46.084986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.889 #54 NEW cov: 11829 ft: 14718 corp: 27/562b lim: 25 exec/s: 54 rss: 
68Mb L: 18/25 MS: 1 EraseBytes- 00:08:34.889 [2024-11-29 05:33:46.125042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.889 [2024-11-29 05:33:46.125069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.125119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.889 [2024-11-29 05:33:46.125134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.125187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.889 [2024-11-29 05:33:46.125203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.125259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.889 [2024-11-29 05:33:46.125274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.889 #55 NEW cov: 11829 ft: 14739 corp: 28/586b lim: 25 exec/s: 55 rss: 69Mb L: 24/25 MS: 1 ChangeBit- 00:08:34.889 [2024-11-29 05:33:46.165279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:34.889 [2024-11-29 05:33:46.165306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.165363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:34.889 [2024-11-29 05:33:46.165376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.165431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:34.889 [2024-11-29 05:33:46.165446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.165500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:34.889 [2024-11-29 05:33:46.165518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.889 [2024-11-29 05:33:46.165571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:34.889 [2024-11-29 05:33:46.165586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.889 #56 NEW cov: 11829 ft: 14745 corp: 29/611b lim: 25 exec/s: 56 rss: 69Mb L: 25/25 MS: 1 CrossOver- 00:08:35.149 [2024-11-29 05:33:46.205284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.149 [2024-11-29 05:33:46.205311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.205361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.149 [2024-11-29 05:33:46.205377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.205432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.149 [2024-11-29 05:33:46.205448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.205503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.149 [2024-11-29 05:33:46.205517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.149 #57 NEW cov: 11829 ft: 14784 corp: 30/635b lim: 25 exec/s: 57 rss: 69Mb L: 24/25 MS: 1 ChangeByte- 00:08:35.149 [2024-11-29 05:33:46.245381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.149 [2024-11-29 05:33:46.245408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.245456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.149 [2024-11-29 05:33:46.245472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.245519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.149 [2024-11-29 05:33:46.245549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.245611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.149 [2024-11-29 05:33:46.245626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.149 #58 NEW cov: 11829 ft: 14791 corp: 31/659b lim: 25 exec/s: 58 rss: 69Mb L: 24/25 MS: 1 ShuffleBytes- 00:08:35.149 [2024-11-29 05:33:46.285353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.149 [2024-11-29 05:33:46.285381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.285445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.149 [2024-11-29 05:33:46.285461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.285517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.149 [2024-11-29 05:33:46.285532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.149 #59 NEW cov: 11829 ft: 14853 corp: 32/678b lim: 25 exec/s: 59 rss: 69Mb L: 19/25 MS: 1 ChangeBinInt- 00:08:35.149 [2024-11-29 05:33:46.325482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.149 [2024-11-29 05:33:46.325509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.325572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.149 [2024-11-29 05:33:46.325588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.325648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.149 [2024-11-29 05:33:46.325664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.149 #60 NEW cov: 11829 ft: 14858 corp: 33/694b lim: 25 exec/s: 60 rss: 69Mb L: 16/25 MS: 1 EraseBytes- 00:08:35.149 [2024-11-29 05:33:46.365928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.149 [2024-11-29 05:33:46.365955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.366003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.149 [2024-11-29 05:33:46.366018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.366073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.149 [2024-11-29 05:33:46.366089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.366144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.149 [2024-11-29 05:33:46.366159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.366213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.149 [2024-11-29 05:33:46.366227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.149 #61 NEW cov: 11829 ft: 14868 corp: 34/719b lim: 25 exec/s: 61 rss: 69Mb L: 25/25 MS: 1 CopyPart- 00:08:35.149 [2024-11-29 05:33:46.405739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.149 [2024-11-29 05:33:46.405766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.405810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.149 [2024-11-29 05:33:46.405825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.405879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.149 [2024-11-29 05:33:46.405894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.149 #65 NEW cov: 11829 ft: 14901 corp: 35/735b lim: 25 exec/s: 65 rss: 69Mb L: 16/25 MS: 4 CrossOver-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:35.149 [2024-11-29 05:33:46.446005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.149 [2024-11-29 05:33:46.446032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.446084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.149 [2024-11-29 05:33:46.446099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.446156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.149 [2024-11-29 05:33:46.446171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.149 [2024-11-29 05:33:46.446226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.149 [2024-11-29 05:33:46.446242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.408 #66 NEW cov: 11829 ft: 14911 corp: 36/759b lim: 25 exec/s: 66 rss: 69Mb L: 24/25 MS: 1 InsertByte- 00:08:35.408 [2024-11-29 05:33:46.485974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.408 [2024-11-29 05:33:46.486001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.486038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.408 [2024-11-29 05:33:46.486053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.486107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.408 [2024-11-29 05:33:46.486122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.408 #67 NEW cov: 11829 ft: 14914 corp: 37/774b lim: 25 exec/s: 67 rss: 69Mb L: 15/25 MS: 1 EraseBytes- 00:08:35.408 [2024-11-29 05:33:46.526187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.408 [2024-11-29 05:33:46.526213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.526277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.408 [2024-11-29 05:33:46.526293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.526345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.408 [2024-11-29 05:33:46.526361] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.526414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.408 [2024-11-29 05:33:46.526429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.408 #68 NEW cov: 11829 ft: 14993 corp: 38/798b lim: 25 exec/s: 68 rss: 69Mb L: 24/25 MS: 1 ShuffleBytes- 00:08:35.408 [2024-11-29 05:33:46.566178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.408 [2024-11-29 05:33:46.566204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.566255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.408 [2024-11-29 05:33:46.566270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.566322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.408 [2024-11-29 05:33:46.566338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.408 #69 NEW cov: 11829 ft: 15027 corp: 39/815b lim: 25 exec/s: 69 rss: 69Mb L: 17/25 MS: 1 ShuffleBytes- 00:08:35.408 [2024-11-29 05:33:46.606441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.408 [2024-11-29 05:33:46.606470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.606507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.408 [2024-11-29 05:33:46.606522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.606573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.408 [2024-11-29 05:33:46.606586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.606646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.408 [2024-11-29 05:33:46.606662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.408 #70 NEW cov: 11829 ft: 15054 corp: 40/839b lim: 25 exec/s: 70 rss: 69Mb L: 24/25 MS: 1 ChangeBit- 00:08:35.408 [2024-11-29 05:33:46.646662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.408 [2024-11-29 05:33:46.646689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.646747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.408 [2024-11-29 05:33:46.646761] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.646815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.408 [2024-11-29 05:33:46.646830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.646883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.408 [2024-11-29 05:33:46.646898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.408 [2024-11-29 05:33:46.646953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.409 [2024-11-29 05:33:46.646968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.409 #71 NEW cov: 11829 ft: 15077 corp: 41/864b lim: 25 exec/s: 71 rss: 69Mb L: 25/25 MS: 1 PersAutoDict- DE: "\373\377\377\377"- 00:08:35.409 [2024-11-29 05:33:46.686647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.409 [2024-11-29 05:33:46.686674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.409 [2024-11-29 05:33:46.686724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.409 [2024-11-29 05:33:46.686739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.409 [2024-11-29 05:33:46.686791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.409 [2024-11-29 05:33:46.686806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.409 [2024-11-29 05:33:46.686858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.409 [2024-11-29 05:33:46.686873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.668 #72 NEW cov: 11829 ft: 15168 corp: 42/888b lim: 25 exec/s: 72 rss: 69Mb L: 24/25 MS: 1 ChangeBit- 00:08:35.668 [2024-11-29 05:33:46.726900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.668 [2024-11-29 05:33:46.726927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.668 [2024-11-29 05:33:46.726980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.668 [2024-11-29 05:33:46.726996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.668 [2024-11-29 05:33:46.727048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.668 [2024-11-29 05:33:46.727063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:35.668 [2024-11-29 05:33:46.727114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.668 [2024-11-29 05:33:46.727129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.668 [2024-11-29 05:33:46.727182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.668 [2024-11-29 05:33:46.727195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.668 #73 NEW cov: 11829 ft: 15178 corp: 43/913b lim: 25 exec/s: 73 rss: 69Mb L: 25/25 MS: 1 CrossOver- 00:08:35.668 [2024-11-29 05:33:46.766906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.668 [2024-11-29 05:33:46.766934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.668 [2024-11-29 05:33:46.766982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.668 [2024-11-29 05:33:46.766997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.668 [2024-11-29 05:33:46.767048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.668 [2024-11-29 05:33:46.767063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.668 [2024-11-29 05:33:46.767114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.668 [2024-11-29 05:33:46.767129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.668 #74 NEW cov: 11829 ft: 15190 corp: 44/936b lim: 25 exec/s: 74 rss: 69Mb L: 23/25 MS: 1 PersAutoDict- DE: "\003\000\000\000"- 00:08:35.668 [2024-11-29 05:33:46.806888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.668 [2024-11-29 05:33:46.806914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.668 [2024-11-29 05:33:46.806969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.668 [2024-11-29 05:33:46.806985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.668 [2024-11-29 05:33:46.807038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.668 [2024-11-29 05:33:46.807052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.668 #75 NEW cov: 11829 ft: 15249 corp: 45/953b lim: 25 exec/s: 37 rss: 69Mb L: 17/25 MS: 1 ChangeByte- 00:08:35.668 #75 DONE cov: 11829 ft: 15249 corp: 45/953b lim: 25 exec/s: 37 rss: 69Mb 00:08:35.668 ###### Recommended dictionary. ###### 00:08:35.668 "\003\000\000\000" # Uses: 3 00:08:35.668 "\373\377\377\377" # Uses: 1 00:08:35.668 ###### End of recommended dictionary. 
###### 00:08:35.668 Done 75 runs in 2 second(s) 00:08:35.668 05:33:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:35.668 05:33:46 -- ../common.sh@72 -- # (( i++ )) 00:08:35.668 05:33:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.668 05:33:46 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:35.668 05:33:46 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:35.668 05:33:46 -- nvmf/run.sh@24 -- # local timen=1 00:08:35.668 05:33:46 -- nvmf/run.sh@25 -- # local core=0x1 00:08:35.668 05:33:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:35.668 05:33:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:35.668 05:33:46 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:35.668 05:33:46 -- nvmf/run.sh@29 -- # port=4424 00:08:35.668 05:33:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:35.668 05:33:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:35.668 05:33:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:35.668 05:33:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:35.928 [2024-11-29 05:33:46.993194] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:35.928 [2024-11-29 05:33:46.993292] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2224219 ] 00:08:35.928 EAL: No free 2048 kB hugepages reported on node 1 00:08:36.188 [2024-11-29 05:33:47.244525] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.188 [2024-11-29 05:33:47.272146] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:36.188 [2024-11-29 05:33:47.272277] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.188 [2024-11-29 05:33:47.323797] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:36.188 [2024-11-29 05:33:47.340159] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:36.188 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.188 INFO: Seed: 3489633914 00:08:36.188 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:36.188 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:36.188 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:36.188 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.188 #2 INITED exec/s: 0 rss: 60Mb 00:08:36.188 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:36.188 This may also happen if the target rejected all inputs we tried so far 00:08:36.188 [2024-11-29 05:33:47.417043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.188 [2024-11-29 05:33:47.417083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.188 [2024-11-29 05:33:47.417154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.188 [2024-11-29 05:33:47.417173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.188 [2024-11-29 05:33:47.417240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.188 [2024-11-29 05:33:47.417259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.447 NEW_FUNC[1/670]: 0x47cdc8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:36.447 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:36.447 #4 NEW cov: 11645 ft: 11675 corp: 2/63b lim: 100 exec/s: 0 rss: 67Mb L: 62/62 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:36.447 [2024-11-29 05:33:47.747346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.447 [2024-11-29 05:33:47.747382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.447 [2024-11-29 05:33:47.747502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:58613 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.447 [2024-11-29 05:33:47.747525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.447 [2024-11-29 05:33:47.747651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.447 [2024-11-29 05:33:47.747672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.707 NEW_FUNC[1/2]: 0x19613b8 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:08:36.707 NEW_FUNC[2/2]: 0x1966848 in _reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:894 00:08:36.707 #5 NEW cov: 11787 ft: 12415 corp: 3/125b lim: 100 exec/s: 0 rss: 67Mb L: 62/62 MS: 1 ChangeBit- 00:08:36.707 [2024-11-29 05:33:47.796893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.796928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:36.707 #6 NEW cov: 11793 ft: 13571 corp: 4/155b lim: 100 exec/s: 0 rss: 67Mb L: 30/62 MS: 1 InsertRepeatedBytes- 00:08:36.707 [2024-11-29 05:33:47.837526] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.837561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.707 [2024-11-29 05:33:47.837690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.837714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.707 [2024-11-29 05:33:47.837839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.837859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.707 #7 NEW cov: 11878 ft: 13896 corp: 5/223b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 CopyPart- 00:08:36.707 [2024-11-29 05:33:47.877438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.877469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.707 [2024-11-29 05:33:47.877583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.877604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.707 [2024-11-29 05:33:47.877722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.877743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.707 #8 NEW cov: 11878 ft: 14018 corp: 6/291b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeBit- 00:08:36.707 [2024-11-29 05:33:47.918000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.918029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.707 [2024-11-29 05:33:47.918106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.918127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.707 [2024-11-29 05:33:47.918241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:36.707 [2024-11-29 05:33:47.918263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.707 [2024-11-29 05:33:47.918378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.918398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.707 #9 NEW cov: 11878 ft: 14429 corp: 7/379b lim: 100 exec/s: 0 rss: 68Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:08:36.707 [2024-11-29 05:33:47.977928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.977960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.707 [2024-11-29 05:33:47.978078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.978098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.707 [2024-11-29 05:33:47.978224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.707 [2024-11-29 05:33:47.978244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.707 #10 NEW cov: 11878 ft: 14584 corp: 8/441b lim: 100 exec/s: 0 rss: 68Mb L: 62/88 MS: 1 ChangeByte- 00:08:36.967 [2024-11-29 05:33:48.017607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.967 [2024-11-29 05:33:48.017638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.967 [2024-11-29 05:33:48.017734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.967 [2024-11-29 05:33:48.017756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.967 [2024-11-29 05:33:48.017884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.967 [2024-11-29 05:33:48.017906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.967 #11 NEW cov: 11878 ft: 14631 corp: 9/509b lim: 100 exec/s: 0 rss: 68Mb L: 68/88 MS: 1 ChangeBinInt- 00:08:36.967 [2024-11-29 05:33:48.057656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.967 [2024-11-29 05:33:48.057689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.967 [2024-11-29 
05:33:48.057769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.967 [2024-11-29 05:33:48.057791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.967 [2024-11-29 05:33:48.057914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.967 [2024-11-29 05:33:48.057934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.967 #12 NEW cov: 11878 ft: 14708 corp: 10/577b lim: 100 exec/s: 0 rss: 68Mb L: 68/88 MS: 1 ChangeByte- 00:08:36.967 [2024-11-29 05:33:48.098190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.967 [2024-11-29 05:33:48.098225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.967 [2024-11-29 05:33:48.098325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.967 [2024-11-29 05:33:48.098345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.967 [2024-11-29 05:33:48.098475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.967 [2024-11-29 05:33:48.098494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.967 #13 NEW cov: 11878 ft: 14736 corp: 11/646b lim: 100 exec/s: 0 rss: 68Mb L: 69/88 MS: 1 CrossOver- 00:08:36.967 [2024-11-29 05:33:48.138400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.138430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.968 [2024-11-29 05:33:48.138550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.138573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.968 [2024-11-29 05:33:48.138709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:2828 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.138733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.968 #14 NEW cov: 11878 ft: 14813 corp: 12/708b lim: 100 exec/s: 0 rss: 68Mb L: 62/88 MS: 1 ChangeBinInt- 00:08:36.968 [2024-11-29 05:33:48.178479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 
05:33:48.178510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.968 [2024-11-29 05:33:48.178631] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.178667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.968 [2024-11-29 05:33:48.178793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.178814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.968 #15 NEW cov: 11878 ft: 14822 corp: 13/770b lim: 100 exec/s: 0 rss: 68Mb L: 62/88 MS: 1 ShuffleBytes- 00:08:36.968 [2024-11-29 05:33:48.218764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.218807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.968 [2024-11-29 05:33:48.218923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.218946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.968 [2024-11-29 05:33:48.219067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.219088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.968 [2024-11-29 05:33:48.219211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.219232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.968 #16 NEW cov: 11878 ft: 14905 corp: 14/860b lim: 100 exec/s: 0 rss: 68Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:36.968 [2024-11-29 05:33:48.268808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.968 [2024-11-29 05:33:48.268841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.227 [2024-11-29 05:33:48.268953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.227 [2024-11-29 05:33:48.268980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.227 [2024-11-29 05:33:48.269100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:2828 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.227 [2024-11-29 05:33:48.269124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.227 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:37.227 #17 NEW cov: 11901 ft: 14946 corp: 15/923b lim: 100 exec/s: 0 rss: 68Mb L: 63/90 MS: 1 InsertByte- 00:08:37.228 [2024-11-29 05:33:48.308854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.308887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.309009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:58370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.309032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.309159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.309182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.228 #18 NEW cov: 11901 ft: 14961 corp: 16/989b lim: 100 exec/s: 0 rss: 68Mb L: 66/90 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:37.228 [2024-11-29 05:33:48.348601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.348632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.348698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.348722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.348846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4109631488 len:2828 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.348868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.228 #19 NEW cov: 11901 ft: 14972 corp: 17/1051b lim: 100 exec/s: 0 rss: 68Mb L: 62/90 MS: 1 ChangeBinInt- 00:08:37.228 [2024-11-29 05:33:48.388997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.389031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.389145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 
05:33:48.389167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.389297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17646498572863337716 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.389319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.228 #20 NEW cov: 11901 ft: 14979 corp: 18/1113b lim: 100 exec/s: 20 rss: 68Mb L: 62/90 MS: 1 ChangeBit- 00:08:37.228 [2024-11-29 05:33:48.429326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.429361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.429482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.429507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.429643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.429664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.228 #21 NEW cov: 11901 ft: 14983 corp: 19/1181b lim: 100 exec/s: 21 rss: 68Mb L: 68/90 MS: 1 ChangeBinInt- 00:08:37.228 [2024-11-29 05:33:48.469439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.469476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.469604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:54517 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.469628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.469755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4109631488 len:2828 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.469779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.228 #22 NEW cov: 11901 ft: 14998 corp: 20/1243b lim: 100 exec/s: 22 rss: 68Mb L: 62/90 MS: 1 ChangeBit- 00:08:37.228 [2024-11-29 05:33:48.509068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.509100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.228 [2024-11-29 05:33:48.509222] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.228 [2024-11-29 05:33:48.509248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.488 #23 NEW cov: 11901 ft: 15302 corp: 21/1294b lim: 100 exec/s: 23 rss: 68Mb L: 51/90 MS: 1 EraseBytes- 00:08:37.488 [2024-11-29 05:33:48.559658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.559691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.488 [2024-11-29 05:33:48.559807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708148 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.559830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.488 [2024-11-29 05:33:48.559958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.559982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.488 #24 NEW cov: 11901 ft: 15338 corp: 22/1362b lim: 100 exec/s: 24 rss: 68Mb L: 68/90 MS: 1 ChangeBit- 00:08:37.488 [2024-11-29 05:33:48.599812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.599850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.488 [2024-11-29 05:33:48.599955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.599979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.488 [2024-11-29 05:33:48.600101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.600121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.488 #25 NEW cov: 11901 ft: 15346 corp: 23/1430b lim: 100 exec/s: 25 rss: 68Mb L: 68/90 MS: 1 CrossOver- 00:08:37.488 [2024-11-29 05:33:48.638953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002168564839668 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.638986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.488 #26 NEW cov: 11901 ft: 15386 corp: 24/1458b lim: 100 exec/s: 26 rss: 68Mb L: 28/90 MS: 1 CrossOver- 00:08:37.488 [2024-11-29 05:33:48.690081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.690114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.488 [2024-11-29 05:33:48.690231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.690257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.488 [2024-11-29 05:33:48.690382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.690405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.488 #27 NEW cov: 11901 ft: 15438 corp: 25/1520b lim: 100 exec/s: 27 rss: 68Mb L: 62/90 MS: 1 ChangeByte- 00:08:37.488 [2024-11-29 05:33:48.729651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17648058638129231092 len:44948 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.729684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.488 [2024-11-29 05:33:48.729804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.729826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.488 [2024-11-29 05:33:48.729950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.729973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.488 #28 NEW cov: 11901 ft: 15448 corp: 26/1588b lim: 100 exec/s: 28 rss: 68Mb L: 68/90 MS: 1 CMP- DE: "\352\177\323\320\235\257\223\000"- 00:08:37.488 [2024-11-29 05:33:48.769573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002168564839668 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.488 [2024-11-29 05:33:48.769606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.748 #29 NEW cov: 11901 ft: 15472 corp: 27/1616b lim: 100 exec/s: 29 rss: 68Mb L: 28/90 MS: 1 CrossOver- 00:08:37.748 [2024-11-29 05:33:48.820374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.820409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:48.820535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.820556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:48.820673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:2805 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.820699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.748 #30 NEW cov: 11901 ft: 15483 corp: 28/1678b lim: 100 exec/s: 30 rss: 69Mb L: 62/90 MS: 1 CrossOver- 00:08:37.748 [2024-11-29 05:33:48.880824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.880859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:48.880968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708148 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.880990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:48.881113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.881136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:48.881264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.881285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.748 #31 NEW cov: 11901 ft: 15513 corp: 29/1769b lim: 100 exec/s: 31 rss: 69Mb L: 91/91 MS: 1 CrossOver- 00:08:37.748 [2024-11-29 05:33:48.940795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.940831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:48.940952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:54517 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.940974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:48.941099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4109631488 len:2828 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:48.941122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.748 #32 NEW cov: 11901 ft: 15532 corp: 30/1831b lim: 100 exec/s: 32 rss: 69Mb L: 62/91 MS: 1 ChangeByte- 00:08:37.748 [2024-11-29 05:33:49.001139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:17651002172490708212 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:49.001176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:49.001278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:49.001299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:49.001418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:49.001439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.748 [2024-11-29 05:33:49.001571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.748 [2024-11-29 05:33:49.001594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.748 #33 NEW cov: 11901 ft: 15547 corp: 31/1921b lim: 100 exec/s: 33 rss: 69Mb L: 90/91 MS: 1 ShuffleBytes- 00:08:38.007 [2024-11-29 05:33:49.050911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.050945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.051009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744069414649855 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.051032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.051144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.051168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.051296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.051317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.008 #34 NEW cov: 11901 ft: 15567 corp: 32/2013b lim: 100 exec/s: 34 rss: 69Mb L: 92/92 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:38.008 [2024-11-29 05:33:49.111352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.111385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.111493] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17649876272583865588 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.111516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.111637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.111661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.008 #35 NEW cov: 11901 ft: 15580 corp: 33/2081b lim: 100 exec/s: 35 rss: 69Mb L: 68/92 MS: 1 ChangeBit- 00:08:38.008 [2024-11-29 05:33:49.151037] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17648111139809457396 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.151071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.151189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.151212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.151342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.151362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.008 #36 NEW cov: 11901 ft: 15590 corp: 34/2145b lim: 100 exec/s: 36 rss: 69Mb L: 64/92 MS: 1 EraseBytes- 00:08:38.008 [2024-11-29 05:33:49.191346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.191380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.191497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.191521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.191645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.191664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.191790] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.191811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 
p:0 m:0 dnr:1 00:08:38.008 #37 NEW cov: 11901 ft: 15618 corp: 35/2235b lim: 100 exec/s: 37 rss: 69Mb L: 90/92 MS: 1 CopyPart- 00:08:38.008 [2024-11-29 05:33:49.231593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17648111139809457396 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.231629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.231712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.231734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.008 [2024-11-29 05:33:49.231852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17650860335490725108 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.231872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.008 #38 NEW cov: 11901 ft: 15627 corp: 36/2299b lim: 100 exec/s: 38 rss: 69Mb L: 64/92 MS: 1 ChangeByte- 00:08:38.008 [2024-11-29 05:33:49.271233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.008 [2024-11-29 05:33:49.271259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.008 #39 NEW cov: 11901 ft: 15637 corp: 37/2338b lim: 100 exec/s: 39 rss: 69Mb L: 39/92 MS: 1 EraseBytes- 00:08:38.269 [2024-11-29 05:33:49.311912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.269 [2024-11-29 05:33:49.311945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.269 [2024-11-29 05:33:49.312033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.269 [2024-11-29 05:33:49.312056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.269 [2024-11-29 05:33:49.312181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.269 [2024-11-29 05:33:49.312205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.269 #40 NEW cov: 11901 ft: 15708 corp: 38/2406b lim: 100 exec/s: 40 rss: 69Mb L: 68/92 MS: 1 ShuffleBytes- 00:08:38.269 [2024-11-29 05:33:49.351999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.269 [2024-11-29 05:33:49.352031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.269 [2024-11-29 05:33:49.352145] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.269 [2024-11-29 05:33:49.352163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.269 [2024-11-29 05:33:49.352284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.269 [2024-11-29 05:33:49.352303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.269 #41 NEW cov: 11901 ft: 15735 corp: 39/2469b lim: 100 exec/s: 41 rss: 69Mb L: 63/92 MS: 1 InsertByte- 00:08:38.269 [2024-11-29 05:33:49.392118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17651002172490708212 len:62693 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.269 [2024-11-29 05:33:49.392147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.269 [2024-11-29 05:33:49.392261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.269 [2024-11-29 05:33:49.392286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.269 [2024-11-29 05:33:49.392409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.269 [2024-11-29 05:33:49.392432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.269 #42 NEW cov: 11901 ft: 15747 corp: 40/2538b lim: 100 exec/s: 21 rss: 70Mb L: 69/92 MS: 1 ShuffleBytes- 00:08:38.269 #42 DONE cov: 11901 ft: 15747 corp: 40/2538b lim: 100 exec/s: 21 rss: 70Mb 00:08:38.269 ###### Recommended dictionary. ###### 00:08:38.269 "\001\000\000\000" # Uses: 1 00:08:38.269 "\352\177\323\320\235\257\223\000" # Uses: 0 00:08:38.269 ###### End of recommended dictionary. 
###### 00:08:38.269 Done 42 runs in 2 second(s) 00:08:38.269 05:33:49 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:38.269 05:33:49 -- ../common.sh@72 -- # (( i++ )) 00:08:38.269 05:33:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.269 05:33:49 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:38.269 00:08:38.269 real 1m3.788s 00:08:38.269 user 1m39.368s 00:08:38.269 sys 0m8.138s 00:08:38.269 05:33:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:38.269 05:33:49 -- common/autotest_common.sh@10 -- # set +x 00:08:38.269 ************************************ 00:08:38.269 END TEST nvmf_fuzz 00:08:38.269 ************************************ 00:08:38.530 05:33:49 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:38.530 05:33:49 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:38.530 05:33:49 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:38.530 05:33:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:38.530 05:33:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:38.530 05:33:49 -- common/autotest_common.sh@10 -- # set +x 00:08:38.530 ************************************ 00:08:38.530 START TEST vfio_fuzz 00:08:38.530 ************************************ 00:08:38.530 05:33:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:38.530 * Looking for test storage... 00:08:38.531 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:38.531 05:33:49 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:38.531 05:33:49 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:38.531 05:33:49 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:38.531 05:33:49 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:38.531 05:33:49 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:38.531 05:33:49 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:38.531 05:33:49 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:38.531 05:33:49 -- scripts/common.sh@335 -- # IFS=.-: 00:08:38.531 05:33:49 -- scripts/common.sh@335 -- # read -ra ver1 00:08:38.531 05:33:49 -- scripts/common.sh@336 -- # IFS=.-: 00:08:38.531 05:33:49 -- scripts/common.sh@336 -- # read -ra ver2 00:08:38.531 05:33:49 -- scripts/common.sh@337 -- # local 'op=<' 00:08:38.531 05:33:49 -- scripts/common.sh@339 -- # ver1_l=2 00:08:38.531 05:33:49 -- scripts/common.sh@340 -- # ver2_l=1 00:08:38.531 05:33:49 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:38.531 05:33:49 -- scripts/common.sh@343 -- # case "$op" in 00:08:38.531 05:33:49 -- scripts/common.sh@344 -- # : 1 00:08:38.531 05:33:49 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:38.531 05:33:49 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:38.531 05:33:49 -- scripts/common.sh@364 -- # decimal 1 00:08:38.531 05:33:49 -- scripts/common.sh@352 -- # local d=1 00:08:38.531 05:33:49 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:38.531 05:33:49 -- scripts/common.sh@354 -- # echo 1 00:08:38.531 05:33:49 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:38.531 05:33:49 -- scripts/common.sh@365 -- # decimal 2 00:08:38.531 05:33:49 -- scripts/common.sh@352 -- # local d=2 00:08:38.531 05:33:49 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:38.531 05:33:49 -- scripts/common.sh@354 -- # echo 2 00:08:38.531 05:33:49 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:38.531 05:33:49 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:38.531 05:33:49 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:38.531 05:33:49 -- scripts/common.sh@367 -- # return 0 00:08:38.531 05:33:49 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:38.531 05:33:49 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:38.531 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:38.531 --rc genhtml_branch_coverage=1 00:08:38.531 --rc genhtml_function_coverage=1 00:08:38.531 --rc genhtml_legend=1 00:08:38.531 --rc geninfo_all_blocks=1 00:08:38.531 --rc geninfo_unexecuted_blocks=1 00:08:38.531 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:38.531 ' 00:08:38.531 05:33:49 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:38.531 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:38.531 --rc genhtml_branch_coverage=1 00:08:38.531 --rc genhtml_function_coverage=1 00:08:38.531 --rc genhtml_legend=1 00:08:38.531 --rc geninfo_all_blocks=1 00:08:38.531 --rc geninfo_unexecuted_blocks=1 00:08:38.531 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:38.531 ' 00:08:38.531 05:33:49 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:38.531 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:38.531 --rc genhtml_branch_coverage=1 00:08:38.531 --rc genhtml_function_coverage=1 00:08:38.531 --rc genhtml_legend=1 00:08:38.531 --rc geninfo_all_blocks=1 00:08:38.531 --rc geninfo_unexecuted_blocks=1 00:08:38.531 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:38.531 ' 00:08:38.531 05:33:49 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:38.531 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:38.531 --rc genhtml_branch_coverage=1 00:08:38.531 --rc genhtml_function_coverage=1 00:08:38.531 --rc genhtml_legend=1 00:08:38.531 --rc geninfo_all_blocks=1 00:08:38.531 --rc geninfo_unexecuted_blocks=1 00:08:38.531 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:38.531 ' 00:08:38.531 05:33:49 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:38.531 05:33:49 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:38.531 05:33:49 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:38.531 05:33:49 -- common/autotest_common.sh@34 -- # set -e 00:08:38.531 05:33:49 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:38.531 05:33:49 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:38.531 05:33:49 -- common/autotest_common.sh@38 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:38.531 05:33:49 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:38.531 05:33:49 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:38.531 05:33:49 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:38.531 05:33:49 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:38.531 05:33:49 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:38.531 05:33:49 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:38.531 05:33:49 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:38.531 05:33:49 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:38.531 05:33:49 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:38.531 05:33:49 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:38.531 05:33:49 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:38.531 05:33:49 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:38.531 05:33:49 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:38.531 05:33:49 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:38.531 05:33:49 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:38.531 05:33:49 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:38.531 05:33:49 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:38.531 05:33:49 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:38.531 05:33:49 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:38.531 05:33:49 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:38.531 05:33:49 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:38.531 05:33:49 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:38.531 05:33:49 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:38.531 05:33:49 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:38.531 05:33:49 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:38.531 05:33:49 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:38.531 05:33:49 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:38.531 05:33:49 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:38.531 05:33:49 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:38.531 05:33:49 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:38.531 05:33:49 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:38.531 05:33:49 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:38.531 05:33:49 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:38.531 05:33:49 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:38.531 05:33:49 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:38.531 05:33:49 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:38.531 05:33:49 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:38.531 05:33:49 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:38.531 05:33:49 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:38.531 05:33:49 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:38.531 05:33:49 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:38.531 05:33:49 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:38.531 
05:33:49 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:38.531 05:33:49 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:38.531 05:33:49 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:38.531 05:33:49 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:38.531 05:33:49 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:38.531 05:33:49 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:38.531 05:33:49 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:38.531 05:33:49 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:38.531 05:33:49 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:38.531 05:33:49 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:38.531 05:33:49 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:38.531 05:33:49 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:38.531 05:33:49 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:38.531 05:33:49 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:38.531 05:33:49 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:38.531 05:33:49 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:38.531 05:33:49 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:38.531 05:33:49 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:38.531 05:33:49 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:38.531 05:33:49 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:38.531 05:33:49 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:38.531 05:33:49 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:38.531 05:33:49 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:38.531 05:33:49 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:38.531 05:33:49 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:38.531 05:33:49 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:38.531 05:33:49 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:38.531 05:33:49 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:38.531 05:33:49 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:38.531 05:33:49 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:38.531 05:33:49 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:38.531 05:33:49 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:38.531 05:33:49 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:38.531 05:33:49 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:38.531 05:33:49 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:38.531 05:33:49 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:38.531 05:33:49 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:38.531 05:33:49 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:38.532 05:33:49 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:38.532 05:33:49 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:38.532 05:33:49 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:38.532 05:33:49 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:38.532 05:33:49 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 
00:08:38.532 05:33:49 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:38.532 05:33:49 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:38.532 05:33:49 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:38.532 05:33:49 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:38.532 05:33:49 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:38.532 05:33:49 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:38.532 05:33:49 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:38.532 05:33:49 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:38.532 05:33:49 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:38.532 05:33:49 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:38.532 05:33:49 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:38.532 #define SPDK_CONFIG_H 00:08:38.532 #define SPDK_CONFIG_APPS 1 00:08:38.532 #define SPDK_CONFIG_ARCH native 00:08:38.532 #undef SPDK_CONFIG_ASAN 00:08:38.532 #undef SPDK_CONFIG_AVAHI 00:08:38.532 #undef SPDK_CONFIG_CET 00:08:38.532 #define SPDK_CONFIG_COVERAGE 1 00:08:38.532 #define SPDK_CONFIG_CROSS_PREFIX 00:08:38.532 #undef SPDK_CONFIG_CRYPTO 00:08:38.532 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:38.532 #undef SPDK_CONFIG_CUSTOMOCF 00:08:38.532 #undef SPDK_CONFIG_DAOS 00:08:38.532 #define SPDK_CONFIG_DAOS_DIR 00:08:38.532 #define SPDK_CONFIG_DEBUG 1 00:08:38.532 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:38.532 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:38.532 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:38.532 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:38.532 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:38.532 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:38.532 #define SPDK_CONFIG_EXAMPLES 1 00:08:38.532 #undef SPDK_CONFIG_FC 00:08:38.532 #define SPDK_CONFIG_FC_PATH 00:08:38.532 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:38.532 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:38.532 #undef SPDK_CONFIG_FUSE 00:08:38.532 #define SPDK_CONFIG_FUZZER 1 00:08:38.532 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:38.532 #undef SPDK_CONFIG_GOLANG 00:08:38.532 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:38.532 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:38.532 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:38.532 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:38.532 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:38.532 #define SPDK_CONFIG_IDXD 1 00:08:38.532 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:38.532 #undef SPDK_CONFIG_IPSEC_MB 00:08:38.532 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:38.532 #define SPDK_CONFIG_ISAL 1 00:08:38.532 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:38.532 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:38.532 #define SPDK_CONFIG_LIBDIR 00:08:38.532 #undef SPDK_CONFIG_LTO 00:08:38.532 #define SPDK_CONFIG_MAX_LCORES 00:08:38.532 #define SPDK_CONFIG_NVME_CUSE 1 00:08:38.532 #undef SPDK_CONFIG_OCF 00:08:38.532 #define SPDK_CONFIG_OCF_PATH 00:08:38.532 #define 
SPDK_CONFIG_OPENSSL_PATH 00:08:38.532 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:38.532 #undef SPDK_CONFIG_PGO_USE 00:08:38.532 #define SPDK_CONFIG_PREFIX /usr/local 00:08:38.532 #undef SPDK_CONFIG_RAID5F 00:08:38.532 #undef SPDK_CONFIG_RBD 00:08:38.532 #define SPDK_CONFIG_RDMA 1 00:08:38.532 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:38.532 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:38.532 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:38.532 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:38.532 #undef SPDK_CONFIG_SHARED 00:08:38.532 #undef SPDK_CONFIG_SMA 00:08:38.532 #define SPDK_CONFIG_TESTS 1 00:08:38.532 #undef SPDK_CONFIG_TSAN 00:08:38.532 #define SPDK_CONFIG_UBLK 1 00:08:38.532 #define SPDK_CONFIG_UBSAN 1 00:08:38.532 #undef SPDK_CONFIG_UNIT_TESTS 00:08:38.532 #undef SPDK_CONFIG_URING 00:08:38.532 #define SPDK_CONFIG_URING_PATH 00:08:38.532 #undef SPDK_CONFIG_URING_ZNS 00:08:38.532 #undef SPDK_CONFIG_USDT 00:08:38.532 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:38.532 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:38.532 #define SPDK_CONFIG_VFIO_USER 1 00:08:38.532 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:38.532 #define SPDK_CONFIG_VHOST 1 00:08:38.532 #define SPDK_CONFIG_VIRTIO 1 00:08:38.532 #undef SPDK_CONFIG_VTUNE 00:08:38.532 #define SPDK_CONFIG_VTUNE_DIR 00:08:38.532 #define SPDK_CONFIG_WERROR 1 00:08:38.532 #define SPDK_CONFIG_WPDK_DIR 00:08:38.532 #undef SPDK_CONFIG_XNVME 00:08:38.532 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:38.532 05:33:49 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:38.532 05:33:49 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:38.532 05:33:49 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:38.532 05:33:49 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:38.532 05:33:49 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:38.532 05:33:49 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:38.532 05:33:49 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:38.532 05:33:49 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:38.532 05:33:49 -- paths/export.sh@5 
-- # export PATH 00:08:38.532 05:33:49 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:38.532 05:33:49 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:38.532 05:33:49 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:38.532 05:33:49 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:38.532 05:33:49 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:38.532 05:33:49 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:38.532 05:33:49 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:38.532 05:33:49 -- pm/common@16 -- # TEST_TAG=N/A 00:08:38.532 05:33:49 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:38.532 05:33:49 -- common/autotest_common.sh@52 -- # : 1 00:08:38.532 05:33:49 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:38.532 05:33:49 -- common/autotest_common.sh@56 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:38.532 05:33:49 -- common/autotest_common.sh@58 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:38.532 05:33:49 -- common/autotest_common.sh@60 -- # : 1 00:08:38.532 05:33:49 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:38.532 05:33:49 -- common/autotest_common.sh@62 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:38.532 05:33:49 -- common/autotest_common.sh@64 -- # : 00:08:38.532 05:33:49 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:38.532 05:33:49 -- common/autotest_common.sh@66 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:38.532 05:33:49 -- common/autotest_common.sh@68 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:38.532 05:33:49 -- common/autotest_common.sh@70 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:38.532 05:33:49 -- common/autotest_common.sh@72 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:38.532 05:33:49 -- common/autotest_common.sh@74 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:38.532 05:33:49 -- common/autotest_common.sh@76 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:38.532 05:33:49 -- common/autotest_common.sh@78 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:38.532 05:33:49 -- common/autotest_common.sh@80 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:38.532 
05:33:49 -- common/autotest_common.sh@82 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:38.532 05:33:49 -- common/autotest_common.sh@84 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:38.532 05:33:49 -- common/autotest_common.sh@86 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:38.532 05:33:49 -- common/autotest_common.sh@88 -- # : 0 00:08:38.532 05:33:49 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:38.533 05:33:49 -- common/autotest_common.sh@90 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:38.533 05:33:49 -- common/autotest_common.sh@92 -- # : 1 00:08:38.533 05:33:49 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:38.533 05:33:49 -- common/autotest_common.sh@94 -- # : 1 00:08:38.533 05:33:49 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:38.533 05:33:49 -- common/autotest_common.sh@96 -- # : rdma 00:08:38.533 05:33:49 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:38.533 05:33:49 -- common/autotest_common.sh@98 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:38.533 05:33:49 -- common/autotest_common.sh@100 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:38.533 05:33:49 -- common/autotest_common.sh@102 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:38.533 05:33:49 -- common/autotest_common.sh@104 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:38.533 05:33:49 -- common/autotest_common.sh@106 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:38.533 05:33:49 -- common/autotest_common.sh@108 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:38.533 05:33:49 -- common/autotest_common.sh@110 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:38.533 05:33:49 -- common/autotest_common.sh@112 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:38.533 05:33:49 -- common/autotest_common.sh@114 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:38.533 05:33:49 -- common/autotest_common.sh@116 -- # : 1 00:08:38.533 05:33:49 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:38.533 05:33:49 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:38.533 05:33:49 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:38.533 05:33:49 -- common/autotest_common.sh@120 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:38.533 05:33:49 -- common/autotest_common.sh@122 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:38.533 05:33:49 -- common/autotest_common.sh@124 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:38.533 05:33:49 -- common/autotest_common.sh@126 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:38.533 05:33:49 -- common/autotest_common.sh@128 -- # : 0 00:08:38.533 05:33:49 -- 
common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:38.533 05:33:49 -- common/autotest_common.sh@130 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:38.533 05:33:49 -- common/autotest_common.sh@132 -- # : v22.11.4 00:08:38.533 05:33:49 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:38.533 05:33:49 -- common/autotest_common.sh@134 -- # : true 00:08:38.533 05:33:49 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:38.533 05:33:49 -- common/autotest_common.sh@136 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:38.533 05:33:49 -- common/autotest_common.sh@138 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:38.533 05:33:49 -- common/autotest_common.sh@140 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:38.533 05:33:49 -- common/autotest_common.sh@142 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:38.533 05:33:49 -- common/autotest_common.sh@144 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:38.533 05:33:49 -- common/autotest_common.sh@146 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:38.533 05:33:49 -- common/autotest_common.sh@148 -- # : 00:08:38.533 05:33:49 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:38.533 05:33:49 -- common/autotest_common.sh@150 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:38.533 05:33:49 -- common/autotest_common.sh@152 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:38.533 05:33:49 -- common/autotest_common.sh@154 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:38.533 05:33:49 -- common/autotest_common.sh@156 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:38.533 05:33:49 -- common/autotest_common.sh@158 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:38.533 05:33:49 -- common/autotest_common.sh@160 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:38.533 05:33:49 -- common/autotest_common.sh@163 -- # : 00:08:38.533 05:33:49 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:38.533 05:33:49 -- common/autotest_common.sh@165 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:38.533 05:33:49 -- common/autotest_common.sh@167 -- # : 0 00:08:38.533 05:33:49 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:38.533 05:33:49 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:38.533 05:33:49 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:38.533 05:33:49 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:38.533 05:33:49 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:38.533 05:33:49 -- common/autotest_common.sh@173 -- # export 
VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:38.533 05:33:49 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:38.533 05:33:49 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:38.533 05:33:49 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:38.533 05:33:49 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:38.533 05:33:49 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:38.533 05:33:49 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:38.533 05:33:49 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:38.533 05:33:49 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:38.533 05:33:49 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:38.533 05:33:49 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:38.533 05:33:49 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:38.533 05:33:49 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:38.533 05:33:49 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:38.533 05:33:49 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:38.533 05:33:49 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:38.533 05:33:49 -- common/autotest_common.sh@196 -- # cat 00:08:38.533 05:33:49 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:38.533 05:33:49 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:38.533 05:33:49 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:38.533 05:33:49 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:38.533 05:33:49 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:38.533 05:33:49 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:38.533 05:33:49 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:38.533 05:33:49 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:38.533 05:33:49 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:38.533 05:33:49 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:38.534 05:33:49 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:38.534 05:33:49 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:38.534 05:33:49 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:38.534 05:33:49 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:38.534 05:33:49 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:38.534 05:33:49 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:38.534 05:33:49 -- common/autotest_common.sh@242 -- # 
AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:38.534 05:33:49 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:38.534 05:33:49 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:38.534 05:33:49 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:08:38.534 05:33:49 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:08:38.534 05:33:49 -- common/autotest_common.sh@249 -- # _LCOV= 00:08:38.534 05:33:49 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:08:38.534 05:33:49 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:08:38.534 05:33:49 -- common/autotest_common.sh@250 -- # _LCOV=1 00:08:38.534 05:33:49 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:38.534 05:33:49 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:08:38.534 05:33:49 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:38.534 05:33:49 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:08:38.534 05:33:49 -- common/autotest_common.sh@259 -- # export valgrind= 00:08:38.534 05:33:49 -- common/autotest_common.sh@259 -- # valgrind= 00:08:38.534 05:33:49 -- common/autotest_common.sh@265 -- # uname -s 00:08:38.793 05:33:49 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:08:38.793 05:33:49 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:08:38.793 05:33:49 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:08:38.793 05:33:49 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:08:38.793 05:33:49 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:38.793 05:33:49 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:38.793 05:33:49 -- common/autotest_common.sh@275 -- # MAKE=make 00:08:38.793 05:33:49 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:08:38.793 05:33:49 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:08:38.793 05:33:49 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:08:38.793 05:33:49 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:38.793 05:33:49 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:08:38.793 05:33:49 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:08:38.793 05:33:49 -- common/autotest_common.sh@319 -- # [[ -z 2224783 ]] 00:08:38.793 05:33:49 -- common/autotest_common.sh@319 -- # kill -0 2224783 00:08:38.793 05:33:49 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:08:38.793 05:33:49 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:08:38.793 05:33:49 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:08:38.793 05:33:49 -- common/autotest_common.sh@332 -- # local mount target_dir 00:08:38.793 05:33:49 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:08:38.793 05:33:49 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:08:38.793 05:33:49 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:08:38.793 05:33:49 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:08:38.793 05:33:49 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.xAIwSk 00:08:38.793 05:33:49 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" 
"$storage_fallback") 00:08:38.793 05:33:49 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:08:38.794 05:33:49 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:08:38.794 05:33:49 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.xAIwSk/tests/vfio /tmp/spdk.xAIwSk 00:08:38.794 05:33:49 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:08:38.794 05:33:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:38.794 05:33:49 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:08:38.794 05:33:49 -- common/autotest_common.sh@328 -- # df -T 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:08:38.794 05:33:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:08:38.794 05:33:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:08:38.794 05:33:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:08:38.794 05:33:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=51925254144 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:08:38.794 05:33:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=9805352960 00:08:38.794 05:33:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:08:38.794 05:33:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:08:38.794 05:33:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:08:38.794 05:33:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:08:38.794 05:33:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # 
avails["$mount"]=30863441920 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:08:38.794 05:33:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=1863680 00:08:38.794 05:33:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:08:38.794 05:33:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:08:38.794 05:33:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:08:38.794 05:33:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:38.794 05:33:49 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:08:38.794 * Looking for test storage... 00:08:38.794 05:33:49 -- common/autotest_common.sh@369 -- # local target_space new_size 00:08:38.794 05:33:49 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:08:38.794 05:33:49 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:38.794 05:33:49 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:38.794 05:33:49 -- common/autotest_common.sh@373 -- # mount=/ 00:08:38.794 05:33:49 -- common/autotest_common.sh@375 -- # target_space=51925254144 00:08:38.794 05:33:49 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:08:38.794 05:33:49 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:08:38.794 05:33:49 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:08:38.794 05:33:49 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:08:38.794 05:33:49 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:08:38.794 05:33:49 -- common/autotest_common.sh@382 -- # new_size=12019945472 00:08:38.794 05:33:49 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:38.794 05:33:49 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:38.794 05:33:49 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:38.794 05:33:49 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:38.794 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:38.794 05:33:49 -- common/autotest_common.sh@390 -- # return 0 00:08:38.794 05:33:49 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:08:38.794 05:33:49 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:08:38.794 05:33:49 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:38.794 05:33:49 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:38.794 05:33:49 -- common/autotest_common.sh@1682 -- # true 00:08:38.794 05:33:49 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:08:38.794 05:33:49 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:38.794 05:33:49 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:38.794 05:33:49 -- 
common/autotest_common.sh@27 -- # exec 00:08:38.794 05:33:49 -- common/autotest_common.sh@29 -- # exec 00:08:38.794 05:33:49 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:38.794 05:33:49 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:38.794 05:33:49 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:38.794 05:33:49 -- common/autotest_common.sh@18 -- # set -x 00:08:38.794 05:33:49 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:38.794 05:33:49 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:38.794 05:33:49 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:38.794 05:33:49 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:38.794 05:33:49 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:38.794 05:33:49 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:38.794 05:33:49 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:38.794 05:33:49 -- scripts/common.sh@335 -- # IFS=.-: 00:08:38.794 05:33:49 -- scripts/common.sh@335 -- # read -ra ver1 00:08:38.794 05:33:49 -- scripts/common.sh@336 -- # IFS=.-: 00:08:38.794 05:33:49 -- scripts/common.sh@336 -- # read -ra ver2 00:08:38.794 05:33:49 -- scripts/common.sh@337 -- # local 'op=<' 00:08:38.794 05:33:49 -- scripts/common.sh@339 -- # ver1_l=2 00:08:38.794 05:33:49 -- scripts/common.sh@340 -- # ver2_l=1 00:08:38.794 05:33:49 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:38.794 05:33:49 -- scripts/common.sh@343 -- # case "$op" in 00:08:38.794 05:33:49 -- scripts/common.sh@344 -- # : 1 00:08:38.794 05:33:49 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:38.794 05:33:49 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:38.794 05:33:49 -- scripts/common.sh@364 -- # decimal 1 00:08:38.794 05:33:49 -- scripts/common.sh@352 -- # local d=1 00:08:38.794 05:33:49 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:38.794 05:33:49 -- scripts/common.sh@354 -- # echo 1 00:08:38.794 05:33:49 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:38.794 05:33:49 -- scripts/common.sh@365 -- # decimal 2 00:08:38.794 05:33:49 -- scripts/common.sh@352 -- # local d=2 00:08:38.794 05:33:49 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:38.794 05:33:49 -- scripts/common.sh@354 -- # echo 2 00:08:38.794 05:33:49 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:38.794 05:33:49 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:38.794 05:33:49 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:38.794 05:33:49 -- scripts/common.sh@367 -- # return 0 00:08:38.794 05:33:49 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:38.794 05:33:49 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:38.794 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:38.794 --rc genhtml_branch_coverage=1 00:08:38.794 --rc genhtml_function_coverage=1 00:08:38.794 --rc genhtml_legend=1 00:08:38.794 --rc geninfo_all_blocks=1 00:08:38.794 --rc geninfo_unexecuted_blocks=1 00:08:38.794 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:38.794 ' 00:08:38.794 05:33:49 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:38.794 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:38.794 --rc genhtml_branch_coverage=1 00:08:38.794 --rc genhtml_function_coverage=1 00:08:38.794 --rc genhtml_legend=1 00:08:38.794 --rc geninfo_all_blocks=1 00:08:38.794 --rc geninfo_unexecuted_blocks=1 
00:08:38.794 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:38.794 ' 00:08:38.794 05:33:49 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:38.794 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:38.794 --rc genhtml_branch_coverage=1 00:08:38.794 --rc genhtml_function_coverage=1 00:08:38.794 --rc genhtml_legend=1 00:08:38.794 --rc geninfo_all_blocks=1 00:08:38.794 --rc geninfo_unexecuted_blocks=1 00:08:38.794 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:38.794 ' 00:08:38.794 05:33:49 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:38.794 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:38.794 --rc genhtml_branch_coverage=1 00:08:38.794 --rc genhtml_function_coverage=1 00:08:38.794 --rc genhtml_legend=1 00:08:38.794 --rc geninfo_all_blocks=1 00:08:38.794 --rc geninfo_unexecuted_blocks=1 00:08:38.794 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:38.794 ' 00:08:38.794 05:33:49 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:38.795 05:33:49 -- ../common.sh@8 -- # pids=() 00:08:38.795 05:33:49 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:38.795 05:33:49 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:38.795 05:33:49 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:38.795 05:33:49 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:38.795 05:33:49 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:38.795 05:33:49 -- vfio/run.sh@65 -- # mem_size=0 00:08:38.795 05:33:49 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:38.795 05:33:49 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:38.795 05:33:49 -- ../common.sh@69 -- # local fuzz_num=7 00:08:38.795 05:33:49 -- ../common.sh@70 -- # local time=1 00:08:38.795 05:33:49 -- ../common.sh@72 -- # (( i = 0 )) 00:08:38.795 05:33:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.795 05:33:49 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:38.795 05:33:49 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:38.795 05:33:49 -- vfio/run.sh@23 -- # local timen=1 00:08:38.795 05:33:49 -- vfio/run.sh@24 -- # local core=0x1 00:08:38.795 05:33:49 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:38.795 05:33:49 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:38.795 05:33:49 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:38.795 05:33:49 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:38.795 05:33:49 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:38.795 05:33:49 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:38.795 05:33:49 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:38.795 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:38.795 05:33:49 -- vfio/run.sh@38 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:38.795 [2024-11-29 05:33:50.021893] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:38.795 [2024-11-29 05:33:50.021961] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2224846 ] 00:08:38.795 EAL: No free 2048 kB hugepages reported on node 1 00:08:38.795 [2024-11-29 05:33:50.094648] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.054 [2024-11-29 05:33:50.132895] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:39.054 [2024-11-29 05:33:50.133044] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.054 INFO: Running with entropic power schedule (0xFF, 100). 00:08:39.054 INFO: Seed: 2156674655 00:08:39.054 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:39.054 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:39.054 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:39.054 INFO: A corpus is not provided, starting from an empty corpus 00:08:39.054 #2 INITED exec/s: 0 rss: 60Mb 00:08:39.054 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:39.054 This may also happen if the target rejected all inputs we tried so far 00:08:39.572 NEW_FUNC[1/631]: 0x450dd8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:39.572 NEW_FUNC[2/631]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:39.572 #5 NEW cov: 10765 ft: 10556 corp: 2/24b lim: 60 exec/s: 0 rss: 65Mb L: 23/23 MS: 3 CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:08:39.831 #7 NEW cov: 10779 ft: 13983 corp: 3/41b lim: 60 exec/s: 0 rss: 67Mb L: 17/23 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:40.090 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:40.090 #8 NEW cov: 10796 ft: 15703 corp: 4/58b lim: 60 exec/s: 0 rss: 68Mb L: 17/23 MS: 1 ShuffleBytes- 00:08:40.349 #9 NEW cov: 10796 ft: 16919 corp: 5/91b lim: 60 exec/s: 9 rss: 68Mb L: 33/33 MS: 1 CrossOver- 00:08:40.349 #10 NEW cov: 10796 ft: 17347 corp: 6/128b lim: 60 exec/s: 10 rss: 68Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:40.609 #16 NEW cov: 10796 ft: 17446 corp: 7/165b lim: 60 exec/s: 16 rss: 68Mb L: 37/37 MS: 1 ShuffleBytes- 00:08:40.868 #17 NEW cov: 10796 ft: 17771 corp: 8/188b lim: 60 exec/s: 17 rss: 68Mb L: 23/37 MS: 1 ChangeBit- 00:08:41.127 #18 NEW cov: 10803 ft: 17930 corp: 9/206b lim: 60 exec/s: 18 rss: 68Mb L: 18/37 MS: 1 EraseBytes- 00:08:41.127 #19 NEW cov: 10803 ft: 18119 corp: 10/244b lim: 60 exec/s: 9 rss: 68Mb L: 38/38 MS: 1 InsertByte- 00:08:41.127 #19 DONE cov: 10803 ft: 18119 corp: 10/244b lim: 60 exec/s: 9 rss: 68Mb 00:08:41.127 Done 19 runs in 2 second(s) 00:08:41.386 05:33:52 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:41.386 05:33:52 -- ../common.sh@72 -- # (( i++ )) 00:08:41.386 05:33:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.386 05:33:52 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:41.386 05:33:52 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:41.386 05:33:52 -- vfio/run.sh@23 -- # local timen=1 00:08:41.386 05:33:52 -- vfio/run.sh@24 -- # local core=0x1 00:08:41.386 05:33:52 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:41.386 05:33:52 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:41.386 05:33:52 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:41.386 05:33:52 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:41.386 05:33:52 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:41.386 05:33:52 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:41.386 05:33:52 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:41.386 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:41.646 05:33:52 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 
1 00:08:41.646 [2024-11-29 05:33:52.718569] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:41.646 [2024-11-29 05:33:52.718648] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2225389 ] 00:08:41.646 EAL: No free 2048 kB hugepages reported on node 1 00:08:41.646 [2024-11-29 05:33:52.788430] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.646 [2024-11-29 05:33:52.824660] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:41.646 [2024-11-29 05:33:52.824820] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.906 INFO: Running with entropic power schedule (0xFF, 100). 00:08:41.906 INFO: Seed: 553707339 00:08:41.906 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:41.906 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:41.906 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:41.906 INFO: A corpus is not provided, starting from an empty corpus 00:08:41.906 #2 INITED exec/s: 0 rss: 59Mb 00:08:41.906 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:41.906 This may also happen if the target rejected all inputs we tried so far 00:08:41.906 [2024-11-29 05:33:53.086650] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:41.906 [2024-11-29 05:33:53.086685] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:41.906 [2024-11-29 05:33:53.086704] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:42.166 NEW_FUNC[1/638]: 0x451378 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:42.166 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:42.166 #10 NEW cov: 10781 ft: 10731 corp: 2/38b lim: 40 exec/s: 0 rss: 66Mb L: 37/37 MS: 3 InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:08:42.425 [2024-11-29 05:33:53.489340] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:42.425 [2024-11-29 05:33:53.489377] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:42.425 [2024-11-29 05:33:53.489412] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:42.425 #11 NEW cov: 10796 ft: 13010 corp: 3/47b lim: 40 exec/s: 0 rss: 67Mb L: 9/37 MS: 1 CMP- DE: "\377\377\377\377\377\377\377p"- 00:08:42.425 [2024-11-29 05:33:53.614296] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:42.425 [2024-11-29 05:33:53.614322] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:42.425 [2024-11-29 05:33:53.614356] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:42.425 #13 NEW cov: 10796 ft: 13633 corp: 4/53b lim: 40 exec/s: 0 rss: 68Mb L: 6/37 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:42.684 [2024-11-29 05:33:53.729130] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:42.684 [2024-11-29 
05:33:53.729157] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:42.684 [2024-11-29 05:33:53.729177] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:42.684 #14 NEW cov: 10796 ft: 13963 corp: 5/89b lim: 40 exec/s: 0 rss: 68Mb L: 36/37 MS: 1 EraseBytes- 00:08:42.684 [2024-11-29 05:33:53.843042] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:42.684 [2024-11-29 05:33:53.843067] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:42.685 [2024-11-29 05:33:53.843086] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:42.685 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:42.685 #15 NEW cov: 10813 ft: 15040 corp: 6/98b lim: 40 exec/s: 0 rss: 68Mb L: 9/37 MS: 1 ShuffleBytes- 00:08:42.685 [2024-11-29 05:33:53.957795] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:42.685 [2024-11-29 05:33:53.957820] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:42.685 [2024-11-29 05:33:53.957839] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:42.944 #16 NEW cov: 10813 ft: 15147 corp: 7/104b lim: 40 exec/s: 16 rss: 68Mb L: 6/37 MS: 1 EraseBytes- 00:08:42.944 [2024-11-29 05:33:54.073502] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:42.944 [2024-11-29 05:33:54.073526] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:42.944 [2024-11-29 05:33:54.073544] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:42.944 #17 NEW cov: 10813 ft: 15423 corp: 8/141b lim: 40 exec/s: 17 rss: 68Mb L: 37/37 MS: 1 ChangeBinInt- 00:08:42.944 [2024-11-29 05:33:54.188432] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:42.944 [2024-11-29 05:33:54.188457] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:42.944 [2024-11-29 05:33:54.188476] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.203 #18 NEW cov: 10813 ft: 15503 corp: 9/151b lim: 40 exec/s: 18 rss: 68Mb L: 10/37 MS: 1 InsertByte- 00:08:43.203 [2024-11-29 05:33:54.303288] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.203 [2024-11-29 05:33:54.303312] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.203 [2024-11-29 05:33:54.303331] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.203 #21 NEW cov: 10813 ft: 15747 corp: 10/155b lim: 40 exec/s: 21 rss: 68Mb L: 4/37 MS: 3 EraseBytes-ShuffleBytes-InsertByte- 00:08:43.203 [2024-11-29 05:33:54.418198] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.203 [2024-11-29 05:33:54.418223] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.203 [2024-11-29 05:33:54.418241] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.203 #22 NEW cov: 10813 ft: 15980 corp: 11/172b lim: 40 exec/s: 22 rss: 68Mb L: 17/37 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377p"- 00:08:43.462 [2024-11-29 05:33:54.532990] vfio_user.c:3096:vfio_user_log: 
*ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.462 [2024-11-29 05:33:54.533015] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.462 [2024-11-29 05:33:54.533034] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.462 #23 NEW cov: 10813 ft: 16026 corp: 12/209b lim: 40 exec/s: 23 rss: 68Mb L: 37/37 MS: 1 ChangeBit- 00:08:43.462 [2024-11-29 05:33:54.647799] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.462 [2024-11-29 05:33:54.647823] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.462 [2024-11-29 05:33:54.647841] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.462 #24 NEW cov: 10813 ft: 16133 corp: 13/246b lim: 40 exec/s: 24 rss: 68Mb L: 37/37 MS: 1 ChangeBit- 00:08:43.462 [2024-11-29 05:33:54.762800] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.462 [2024-11-29 05:33:54.762826] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.462 [2024-11-29 05:33:54.762845] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.722 #25 NEW cov: 10820 ft: 16184 corp: 14/285b lim: 40 exec/s: 25 rss: 69Mb L: 39/39 MS: 1 CopyPart- 00:08:43.722 [2024-11-29 05:33:54.877584] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.722 [2024-11-29 05:33:54.877619] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.722 [2024-11-29 05:33:54.877654] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.722 #26 NEW cov: 10820 ft: 16206 corp: 15/303b lim: 40 exec/s: 26 rss: 69Mb L: 18/39 MS: 1 EraseBytes- 00:08:43.722 [2024-11-29 05:33:54.992452] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.722 [2024-11-29 05:33:54.992477] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.722 [2024-11-29 05:33:54.992495] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.983 #27 NEW cov: 10820 ft: 16219 corp: 16/339b lim: 40 exec/s: 13 rss: 69Mb L: 36/39 MS: 1 CopyPart- 00:08:43.983 #27 DONE cov: 10820 ft: 16219 corp: 16/339b lim: 40 exec/s: 13 rss: 69Mb 00:08:43.983 ###### Recommended dictionary. ###### 00:08:43.983 "\377\377\377\377\377\377\377p" # Uses: 1 00:08:43.984 ###### End of recommended dictionary. 
###### 00:08:43.984 Done 27 runs in 2 second(s) 00:08:44.249 05:33:55 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:44.249 05:33:55 -- ../common.sh@72 -- # (( i++ )) 00:08:44.249 05:33:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.249 05:33:55 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:44.250 05:33:55 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:44.250 05:33:55 -- vfio/run.sh@23 -- # local timen=1 00:08:44.250 05:33:55 -- vfio/run.sh@24 -- # local core=0x1 00:08:44.250 05:33:55 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:44.250 05:33:55 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:44.250 05:33:55 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:44.250 05:33:55 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:44.250 05:33:55 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:44.250 05:33:55 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:44.250 05:33:55 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:44.250 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:44.250 05:33:55 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:44.250 [2024-11-29 05:33:55.368154] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:44.250 [2024-11-29 05:33:55.368249] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2225933 ] 00:08:44.250 EAL: No free 2048 kB hugepages reported on node 1 00:08:44.250 [2024-11-29 05:33:55.440345] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.250 [2024-11-29 05:33:55.475458] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:44.250 [2024-11-29 05:33:55.475604] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.510 INFO: Running with entropic power schedule (0xFF, 100). 00:08:44.510 INFO: Seed: 3196688970 00:08:44.510 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:44.510 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:44.510 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:44.510 INFO: A corpus is not provided, starting from an empty corpus 00:08:44.510 #2 INITED exec/s: 0 rss: 59Mb 00:08:44.510 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:44.510 This may also happen if the target rejected all inputs we tried so far 00:08:44.510 [2024-11-29 05:33:55.728916] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:45.029 NEW_FUNC[1/636]: 0x451d68 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:45.029 NEW_FUNC[2/636]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:45.029 #7 NEW cov: 10762 ft: 10369 corp: 2/64b lim: 80 exec/s: 0 rss: 66Mb L: 63/63 MS: 5 ChangeByte-ShuffleBytes-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:45.029 [2024-11-29 05:33:56.133002] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:45.029 #12 NEW cov: 10776 ft: 13007 corp: 3/107b lim: 80 exec/s: 0 rss: 67Mb L: 43/63 MS: 5 ChangeBit-ChangeBinInt-InsertByte-EraseBytes-CrossOver- 00:08:45.029 [2024-11-29 05:33:56.256997] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:45.029 #13 NEW cov: 10776 ft: 13987 corp: 4/170b lim: 80 exec/s: 0 rss: 68Mb L: 63/63 MS: 1 ChangeBit- 00:08:45.287 [2024-11-29 05:33:56.371019] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:45.288 #14 NEW cov: 10776 ft: 14949 corp: 5/213b lim: 80 exec/s: 0 rss: 68Mb L: 43/63 MS: 1 CopyPart- 00:08:45.288 [2024-11-29 05:33:56.485829] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:45.288 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:45.288 #15 NEW cov: 10793 ft: 15084 corp: 6/253b lim: 80 exec/s: 0 rss: 68Mb L: 40/63 MS: 1 InsertRepeatedBytes- 00:08:45.546 [2024-11-29 05:33:56.600223] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:45.546 #16 NEW cov: 10793 ft: 15218 corp: 7/261b lim: 80 exec/s: 16 rss: 68Mb L: 8/63 MS: 1 CrossOver- 00:08:45.546 [2024-11-29 05:33:56.712626] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:45.546 #17 NEW cov: 10793 ft: 15375 corp: 8/325b lim: 80 exec/s: 17 rss: 68Mb L: 64/64 MS: 1 InsertByte- 00:08:45.546 [2024-11-29 05:33:56.826373] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:45.805 #18 NEW cov: 10793 ft: 15401 corp: 9/333b lim: 80 exec/s: 18 rss: 68Mb L: 8/64 MS: 1 ShuffleBytes- 00:08:45.805 [2024-11-29 05:33:56.940168] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:45.805 #19 NEW cov: 10793 ft: 15464 corp: 10/396b lim: 80 exec/s: 19 rss: 68Mb L: 63/64 MS: 1 ChangeBit- 00:08:45.805 [2024-11-29 05:33:57.054683] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:45.805 [2024-11-29 05:33:57.054717] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:46.064 NEW_FUNC[1/2]: 0x1330b08 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:08:46.064 NEW_FUNC[2/2]: 0x1330da8 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:08:46.064 #20 NEW cov: 10806 ft: 15900 corp: 11/426b lim: 80 exec/s: 20 rss: 68Mb L: 30/64 MS: 1 InsertRepeatedBytes- 00:08:46.064 [2024-11-29 05:33:57.179114] vfio_user.c: 170:vfio_user_dev_send_request: 
*ERROR*: Oversized argument length, command 5 00:08:46.064 #21 NEW cov: 10806 ft: 15963 corp: 12/434b lim: 80 exec/s: 21 rss: 68Mb L: 8/64 MS: 1 CrossOver- 00:08:46.064 #22 NEW cov: 10806 ft: 16567 corp: 13/465b lim: 80 exec/s: 22 rss: 68Mb L: 31/64 MS: 1 InsertByte- 00:08:46.323 [2024-11-29 05:33:57.406746] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:46.323 #23 NEW cov: 10806 ft: 16759 corp: 14/505b lim: 80 exec/s: 23 rss: 69Mb L: 40/64 MS: 1 ChangeByte- 00:08:46.323 #24 NEW cov: 10813 ft: 16900 corp: 15/536b lim: 80 exec/s: 24 rss: 69Mb L: 31/64 MS: 1 ChangeBinInt- 00:08:46.583 [2024-11-29 05:33:57.635082] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:46.583 #25 NEW cov: 10813 ft: 17241 corp: 16/579b lim: 80 exec/s: 12 rss: 69Mb L: 43/64 MS: 1 ChangeByte- 00:08:46.583 #25 DONE cov: 10813 ft: 17241 corp: 16/579b lim: 80 exec/s: 12 rss: 69Mb 00:08:46.583 Done 25 runs in 2 second(s) 00:08:46.842 05:33:57 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:46.842 05:33:57 -- ../common.sh@72 -- # (( i++ )) 00:08:46.842 05:33:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:46.842 05:33:57 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:46.842 05:33:57 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:46.842 05:33:57 -- vfio/run.sh@23 -- # local timen=1 00:08:46.842 05:33:57 -- vfio/run.sh@24 -- # local core=0x1 00:08:46.842 05:33:57 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:46.842 05:33:57 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:46.842 05:33:57 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:46.842 05:33:57 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:46.842 05:33:57 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:46.842 05:33:57 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:46.842 05:33:57 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:46.842 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:46.842 05:33:57 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:46.842 [2024-11-29 05:33:58.006525] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:46.842 [2024-11-29 05:33:58.006627] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2226259 ] 00:08:46.842 EAL: No free 2048 kB hugepages reported on node 1 00:08:46.842 [2024-11-29 05:33:58.077899] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.842 [2024-11-29 05:33:58.114821] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:46.842 [2024-11-29 05:33:58.114970] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.102 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.102 INFO: Seed: 1547729019 00:08:47.102 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:47.102 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:47.102 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:47.102 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.102 #2 INITED exec/s: 0 rss: 60Mb 00:08:47.102 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:47.102 This may also happen if the target rejected all inputs we tried so far 00:08:47.620 NEW_FUNC[1/632]: 0x452458 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:47.620 NEW_FUNC[2/632]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:47.620 #6 NEW cov: 10742 ft: 10413 corp: 2/63b lim: 320 exec/s: 0 rss: 66Mb L: 62/62 MS: 4 CopyPart-InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:47.879 #7 NEW cov: 10761 ft: 13546 corp: 3/126b lim: 320 exec/s: 0 rss: 67Mb L: 63/63 MS: 1 CrossOver- 00:08:47.879 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:47.879 #13 NEW cov: 10778 ft: 15549 corp: 4/190b lim: 320 exec/s: 0 rss: 68Mb L: 64/64 MS: 1 InsertByte- 00:08:48.138 #14 NEW cov: 10778 ft: 16019 corp: 5/309b lim: 320 exec/s: 0 rss: 68Mb L: 119/119 MS: 1 CopyPart- 00:08:48.398 #15 NEW cov: 10778 ft: 16047 corp: 6/512b lim: 320 exec/s: 15 rss: 68Mb L: 203/203 MS: 1 InsertRepeatedBytes- 00:08:48.398 #16 NEW cov: 10778 ft: 16499 corp: 7/575b lim: 320 exec/s: 16 rss: 68Mb L: 63/203 MS: 1 CrossOver- 00:08:48.657 #17 NEW cov: 10778 ft: 16644 corp: 8/708b lim: 320 exec/s: 17 rss: 68Mb L: 133/203 MS: 1 InsertRepeatedBytes- 00:08:48.916 #18 NEW cov: 10778 ft: 16874 corp: 9/841b lim: 320 exec/s: 18 rss: 69Mb L: 133/203 MS: 1 ChangeBit- 00:08:48.916 #19 NEW cov: 10785 ft: 17057 corp: 10/960b lim: 320 exec/s: 19 rss: 69Mb L: 119/203 MS: 1 ChangeByte- 00:08:49.175 #20 NEW cov: 10785 ft: 17367 corp: 11/1141b lim: 320 exec/s: 10 rss: 69Mb L: 181/203 MS: 1 InsertRepeatedBytes- 00:08:49.175 #20 DONE cov: 10785 ft: 17367 corp: 11/1141b lim: 320 exec/s: 10 rss: 69Mb 00:08:49.175 Done 20 runs in 2 second(s) 00:08:49.434 05:34:00 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:49.434 05:34:00 -- ../common.sh@72 -- # (( i++ )) 00:08:49.434 05:34:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.434 05:34:00 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:49.434 05:34:00 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:49.434 05:34:00 -- vfio/run.sh@23 -- # local timen=1 
00:08:49.434 05:34:00 -- vfio/run.sh@24 -- # local core=0x1 00:08:49.434 05:34:00 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:49.434 05:34:00 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:49.434 05:34:00 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:49.434 05:34:00 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:49.434 05:34:00 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:49.434 05:34:00 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:49.434 05:34:00 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:49.434 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:49.434 05:34:00 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:49.434 [2024-11-29 05:34:00.626965] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:49.434 [2024-11-29 05:34:00.627037] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2226774 ] 00:08:49.434 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.434 [2024-11-29 05:34:00.698615] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.434 [2024-11-29 05:34:00.735199] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:49.434 [2024-11-29 05:34:00.735345] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.694 INFO: Running with entropic power schedule (0xFF, 100). 00:08:49.694 INFO: Seed: 4158748542 00:08:49.694 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:49.694 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:49.694 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:49.694 INFO: A corpus is not provided, starting from an empty corpus 00:08:49.694 #2 INITED exec/s: 0 rss: 60Mb 00:08:49.694 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:49.694 This may also happen if the target rejected all inputs we tried so far 00:08:50.213 NEW_FUNC[1/632]: 0x452cd8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:50.213 NEW_FUNC[2/632]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:50.213 #12 NEW cov: 10754 ft: 10718 corp: 2/123b lim: 320 exec/s: 0 rss: 66Mb L: 122/122 MS: 5 CopyPart-ChangeByte-ChangeBinInt-CrossOver-InsertRepeatedBytes- 00:08:50.471 #25 NEW cov: 10768 ft: 13992 corp: 3/178b lim: 320 exec/s: 0 rss: 67Mb L: 55/122 MS: 3 CrossOver-CrossOver-InsertRepeatedBytes- 00:08:50.729 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:50.729 #26 NEW cov: 10785 ft: 15265 corp: 4/214b lim: 320 exec/s: 0 rss: 68Mb L: 36/122 MS: 1 EraseBytes- 00:08:50.729 [2024-11-29 05:34:01.884293] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:50.729 [2024-11-29 05:34:01.884338] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:50.729 [2024-11-29 05:34:01.884349] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:50.729 [2024-11-29 05:34:01.884367] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:50.729 [2024-11-29 05:34:01.885292] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:50.729 [2024-11-29 05:34:01.885312] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:50.729 [2024-11-29 05:34:01.885329] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:50.729 NEW_FUNC[1/6]: 0x1330b08 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:08:50.729 NEW_FUNC[2/6]: 0x1330da8 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:08:50.729 #28 NEW cov: 10819 ft: 16276 corp: 5/271b lim: 320 exec/s: 28 rss: 68Mb L: 57/122 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:50.989 #29 NEW cov: 10819 ft: 16604 corp: 6/364b lim: 320 exec/s: 29 rss: 68Mb L: 93/122 MS: 1 InsertRepeatedBytes- 00:08:51.248 #30 NEW cov: 10819 ft: 16818 corp: 7/486b lim: 320 exec/s: 30 rss: 68Mb L: 122/122 MS: 1 CopyPart- 00:08:51.248 [2024-11-29 05:34:02.482175] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:51.248 [2024-11-29 05:34:02.482202] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:51.248 [2024-11-29 05:34:02.482213] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:51.248 [2024-11-29 05:34:02.482245] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:51.248 [2024-11-29 05:34:02.483191] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:51.248 [2024-11-29 05:34:02.483210] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:51.248 [2024-11-29 05:34:02.483226] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:51.507 #31 NEW cov: 10819 ft: 17082 corp: 8/656b lim: 320 exec/s: 31 rss: 69Mb L: 170/170 MS: 1 InsertRepeatedBytes- 00:08:51.507 #32 NEW cov: 10826 ft: 17361 corp: 9/719b lim: 320 exec/s: 32 rss: 69Mb L: 63/170 MS: 1 CMP- DE: "\377\372.\031\000 \000\000"- 00:08:51.765 [2024-11-29 05:34:02.867306] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:51.765 [2024-11-29 05:34:02.867332] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:51.765 [2024-11-29 05:34:02.867343] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:51.765 [2024-11-29 05:34:02.867375] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:51.765 [2024-11-29 05:34:02.868329] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:51.765 [2024-11-29 05:34:02.868348] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:51.765 [2024-11-29 05:34:02.868365] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:51.765 #33 NEW cov: 10826 ft: 17466 corp: 10/776b lim: 320 exec/s: 16 rss: 69Mb L: 57/170 MS: 1 ChangeByte- 00:08:51.765 #33 DONE cov: 10826 ft: 17466 corp: 10/776b lim: 320 exec/s: 16 rss: 69Mb 00:08:51.765 ###### Recommended dictionary. ###### 00:08:51.765 "\377\372.\031\000 \000\000" # Uses: 0 00:08:51.765 ###### End of recommended dictionary. 
###### 00:08:51.765 Done 33 runs in 2 second(s) 00:08:52.024 05:34:03 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:52.024 05:34:03 -- ../common.sh@72 -- # (( i++ )) 00:08:52.024 05:34:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.024 05:34:03 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:52.024 05:34:03 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:52.024 05:34:03 -- vfio/run.sh@23 -- # local timen=1 00:08:52.024 05:34:03 -- vfio/run.sh@24 -- # local core=0x1 00:08:52.024 05:34:03 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:52.024 05:34:03 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:52.024 05:34:03 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:52.024 05:34:03 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:52.024 05:34:03 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:52.025 05:34:03 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:52.025 05:34:03 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:52.025 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:52.025 05:34:03 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:52.025 [2024-11-29 05:34:03.277637] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:52.025 [2024-11-29 05:34:03.277731] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2227314 ] 00:08:52.025 EAL: No free 2048 kB hugepages reported on node 1 00:08:52.284 [2024-11-29 05:34:03.350516] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.284 [2024-11-29 05:34:03.385757] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:52.284 [2024-11-29 05:34:03.385897] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.284 INFO: Running with entropic power schedule (0xFF, 100). 00:08:52.284 INFO: Seed: 2512762111 00:08:52.284 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:52.284 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:52.284 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:52.284 INFO: A corpus is not provided, starting from an empty corpus 00:08:52.284 #2 INITED exec/s: 0 rss: 60Mb 00:08:52.284 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:52.284 This may also happen if the target rejected all inputs we tried so far 00:08:52.542 [2024-11-29 05:34:03.661644] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.542 [2024-11-29 05:34:03.661689] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.801 NEW_FUNC[1/637]: 0x4536d8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:52.801 NEW_FUNC[2/637]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:52.801 #4 NEW cov: 10769 ft: 10565 corp: 2/14b lim: 120 exec/s: 0 rss: 65Mb L: 13/13 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:53.059 [2024-11-29 05:34:04.109419] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.059 [2024-11-29 05:34:04.109465] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.059 NEW_FUNC[1/1]: 0x1c3b968 in _get_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:328 00:08:53.059 #5 NEW cov: 10795 ft: 13106 corp: 3/66b lim: 120 exec/s: 0 rss: 67Mb L: 52/52 MS: 1 InsertRepeatedBytes- 00:08:53.059 [2024-11-29 05:34:04.306559] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.059 [2024-11-29 05:34:04.306591] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.318 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:53.318 #9 NEW cov: 10812 ft: 14662 corp: 4/121b lim: 120 exec/s: 0 rss: 68Mb L: 55/55 MS: 4 ShuffleBytes-InsertByte-InsertByte-CrossOver- 00:08:53.318 [2024-11-29 05:34:04.505326] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.318 [2024-11-29 05:34:04.505358] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.318 #10 NEW cov: 10815 ft: 14760 corp: 5/173b lim: 120 exec/s: 10 rss: 68Mb L: 52/55 MS: 1 ChangeBit- 00:08:53.576 [2024-11-29 05:34:04.692136] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.576 [2024-11-29 05:34:04.692166] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.576 #13 NEW cov: 10815 ft: 15089 corp: 6/230b lim: 120 exec/s: 13 rss: 68Mb L: 57/57 MS: 3 ChangeBit-InsertByte-CrossOver- 00:08:53.835 [2024-11-29 05:34:04.879983] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.835 [2024-11-29 05:34:04.880017] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.835 #14 NEW cov: 10815 ft: 15618 corp: 7/286b lim: 120 exec/s: 14 rss: 68Mb L: 56/57 MS: 1 InsertByte- 00:08:53.835 [2024-11-29 05:34:05.065849] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.835 [2024-11-29 05:34:05.065879] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.093 #18 NEW cov: 10815 ft: 15845 corp: 8/321b lim: 120 exec/s: 18 rss: 68Mb L: 35/57 MS: 4 CopyPart-ShuffleBytes-CopyPart-CrossOver- 00:08:54.093 [2024-11-29 05:34:05.254103] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.093 [2024-11-29 05:34:05.254133] vfio_user.c: 
144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.093 #23 NEW cov: 10815 ft: 16193 corp: 9/418b lim: 120 exec/s: 23 rss: 68Mb L: 97/97 MS: 5 ShuffleBytes-CopyPart-EraseBytes-InsertByte-InsertRepeatedBytes- 00:08:54.351 [2024-11-29 05:34:05.440176] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.351 [2024-11-29 05:34:05.440205] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.351 #24 NEW cov: 10822 ft: 16550 corp: 10/537b lim: 120 exec/s: 24 rss: 68Mb L: 119/119 MS: 1 InsertRepeatedBytes- 00:08:54.351 [2024-11-29 05:34:05.628268] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.351 [2024-11-29 05:34:05.628298] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.610 #25 NEW cov: 10822 ft: 16714 corp: 11/592b lim: 120 exec/s: 12 rss: 68Mb L: 55/119 MS: 1 ChangeBit- 00:08:54.610 #25 DONE cov: 10822 ft: 16714 corp: 11/592b lim: 120 exec/s: 12 rss: 68Mb 00:08:54.610 Done 25 runs in 2 second(s) 00:08:54.869 05:34:05 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:54.869 05:34:05 -- ../common.sh@72 -- # (( i++ )) 00:08:54.869 05:34:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:54.869 05:34:05 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:54.869 05:34:05 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:54.869 05:34:05 -- vfio/run.sh@23 -- # local timen=1 00:08:54.869 05:34:05 -- vfio/run.sh@24 -- # local core=0x1 00:08:54.869 05:34:05 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:54.869 05:34:05 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:54.869 05:34:05 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:54.869 05:34:06 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:54.869 05:34:06 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:54.870 05:34:06 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:54.870 05:34:06 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:54.870 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:54.870 05:34:06 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:54.870 [2024-11-29 05:34:06.039140] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:54.870 [2024-11-29 05:34:06.039210] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2227786 ] 00:08:54.870 EAL: No free 2048 kB hugepages reported on node 1 00:08:54.870 [2024-11-29 05:34:06.110830] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:54.870 [2024-11-29 05:34:06.146589] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:54.870 [2024-11-29 05:34:06.146747] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.129 INFO: Running with entropic power schedule (0xFF, 100). 00:08:55.129 INFO: Seed: 980801835 00:08:55.129 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:55.129 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:55.129 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:55.129 INFO: A corpus is not provided, starting from an empty corpus 00:08:55.129 #2 INITED exec/s: 0 rss: 60Mb 00:08:55.129 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:55.129 This may also happen if the target rejected all inputs we tried so far 00:08:55.129 [2024-11-29 05:34:06.424658] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.129 [2024-11-29 05:34:06.424701] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.647 NEW_FUNC[1/638]: 0x4543c8 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:55.647 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:55.647 #21 NEW cov: 10772 ft: 10671 corp: 2/10b lim: 90 exec/s: 0 rss: 66Mb L: 9/9 MS: 4 InsertByte-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:55.647 [2024-11-29 05:34:06.877961] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.647 [2024-11-29 05:34:06.878005] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.906 #22 NEW cov: 10786 ft: 14064 corp: 3/19b lim: 90 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:55.906 [2024-11-29 05:34:07.070092] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.906 [2024-11-29 05:34:07.070122] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.906 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:55.906 #28 NEW cov: 10803 ft: 15416 corp: 4/28b lim: 90 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:56.165 [2024-11-29 05:34:07.260426] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.165 [2024-11-29 05:34:07.260457] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.165 #29 NEW cov: 10803 ft: 16446 corp: 5/37b lim: 90 exec/s: 29 rss: 68Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:56.165 [2024-11-29 05:34:07.448160] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.165 [2024-11-29 05:34:07.448194] 
vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.424 #30 NEW cov: 10803 ft: 16919 corp: 6/46b lim: 90 exec/s: 30 rss: 68Mb L: 9/9 MS: 1 ChangeBit- 00:08:56.424 [2024-11-29 05:34:07.637649] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.424 [2024-11-29 05:34:07.637680] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.683 #31 NEW cov: 10803 ft: 17092 corp: 7/59b lim: 90 exec/s: 31 rss: 68Mb L: 13/13 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:56.683 [2024-11-29 05:34:07.828108] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.683 [2024-11-29 05:34:07.828138] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.683 #32 NEW cov: 10803 ft: 17579 corp: 8/98b lim: 90 exec/s: 32 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:56.942 [2024-11-29 05:34:08.015770] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.942 [2024-11-29 05:34:08.015799] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.942 #33 NEW cov: 10803 ft: 17731 corp: 9/175b lim: 90 exec/s: 33 rss: 68Mb L: 77/77 MS: 1 InsertRepeatedBytes- 00:08:56.942 [2024-11-29 05:34:08.204077] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.942 [2024-11-29 05:34:08.204107] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.200 #34 NEW cov: 10810 ft: 18014 corp: 10/199b lim: 90 exec/s: 34 rss: 68Mb L: 24/77 MS: 1 InsertRepeatedBytes- 00:08:57.200 [2024-11-29 05:34:08.395940] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.200 [2024-11-29 05:34:08.395971] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.459 #35 NEW cov: 10810 ft: 18241 corp: 11/208b lim: 90 exec/s: 17 rss: 69Mb L: 9/77 MS: 1 ChangeBit- 00:08:57.459 #35 DONE cov: 10810 ft: 18241 corp: 11/208b lim: 90 exec/s: 17 rss: 69Mb 00:08:57.459 ###### Recommended dictionary. ###### 00:08:57.459 "\002\000\000\000" # Uses: 1 00:08:57.459 ###### End of recommended dictionary. 
###### 00:08:57.459 Done 35 runs in 2 second(s) 00:08:57.718 05:34:08 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:57.718 05:34:08 -- ../common.sh@72 -- # (( i++ )) 00:08:57.718 05:34:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:57.718 05:34:08 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:57.718 00:08:57.718 real 0m19.188s 00:08:57.718 user 0m26.596s 00:08:57.718 sys 0m1.867s 00:08:57.718 05:34:08 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:57.718 05:34:08 -- common/autotest_common.sh@10 -- # set +x 00:08:57.718 ************************************ 00:08:57.718 END TEST vfio_fuzz 00:08:57.718 ************************************ 00:08:57.718 00:08:57.718 real 1m23.265s 00:08:57.718 user 2m6.104s 00:08:57.718 sys 0m10.190s 00:08:57.718 05:34:08 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:57.718 05:34:08 -- common/autotest_common.sh@10 -- # set +x 00:08:57.718 ************************************ 00:08:57.718 END TEST llvm_fuzz 00:08:57.718 ************************************ 00:08:57.718 05:34:08 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:08:57.718 05:34:08 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:08:57.718 05:34:08 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:08:57.718 05:34:08 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:57.718 05:34:08 -- common/autotest_common.sh@10 -- # set +x 00:08:57.718 05:34:08 -- spdk/autotest.sh@373 -- # autotest_cleanup 00:08:57.718 05:34:08 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:08:57.718 05:34:08 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:08:57.718 05:34:08 -- common/autotest_common.sh@10 -- # set +x 00:09:04.289 INFO: APP EXITING 00:09:04.289 INFO: killing all VMs 00:09:04.289 INFO: killing vhost app 00:09:04.289 INFO: EXIT DONE 00:09:06.858 Waiting for block devices as requested 00:09:06.858 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:06.858 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:06.858 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:06.858 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:06.858 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:07.117 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:07.117 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:07.117 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:07.117 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:07.376 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:07.376 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:07.376 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:07.634 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:07.634 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:07.634 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:07.893 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:07.893 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:11.174 Cleaning 00:09:11.174 Removing: /dev/shm/spdk_tgt_trace.pid2190427 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2187949 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2189209 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2190427 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2191225 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2191558 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2191885 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2192237 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2192583 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2192874 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2193158 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2193476 
00:09:11.174 Removing: /var/run/dpdk/spdk_pid2194336 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2197474 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2197837 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2198143 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2198183 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2198731 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2198989 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2199379 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2199581 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2199879 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2200045 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2200194 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2200465 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2200848 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2201126 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2201414 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2201688 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2201884 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2202062 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2202130 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2202399 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2202680 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2202832 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2203015 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2203255 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2203538 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2203808 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2204098 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2204328 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2204520 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2204677 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2204953 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2205221 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2205507 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2205782 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2206036 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2206180 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2206374 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2206641 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2206928 00:09:11.174 Removing: /var/run/dpdk/spdk_pid2207195 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2207484 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2207688 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2207885 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2208057 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2208347 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2208618 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2208899 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2209167 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2209381 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2209527 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2209756 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2210035 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2210322 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2210591 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2210882 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2211062 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2211248 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2211454 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2211747 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2211967 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2212152 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2212908 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2213420 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2213745 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2214282 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2214795 
00:09:11.431 Removing: /var/run/dpdk/spdk_pid2215118 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2215782 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2216489 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2217067 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2217598 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2218121 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2218429 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2218972 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2219423 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2219809 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2220346 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2220759 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2221177 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2221715 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2222012 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2222554 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2222994 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2223388 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2223928 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2224219 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2224846 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2225389 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2225933 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2226259 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2226774 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2227314 00:09:11.431 Removing: /var/run/dpdk/spdk_pid2227786 00:09:11.431 Clean 00:09:11.689 killing process with pid 2140044 00:09:15.877 killing process with pid 2140040 00:09:15.877 killing process with pid 2140042 00:09:15.877 killing process with pid 2140041 00:09:15.877 05:34:26 -- common/autotest_common.sh@1446 -- # return 0 00:09:15.877 05:34:26 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:09:15.877 05:34:26 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:15.877 05:34:26 -- common/autotest_common.sh@10 -- # set +x 00:09:15.877 05:34:26 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:09:15.877 05:34:26 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:15.877 05:34:26 -- common/autotest_common.sh@10 -- # set +x 00:09:15.877 05:34:26 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:15.877 05:34:26 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:15.877 05:34:26 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:15.877 05:34:26 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:09:15.877 05:34:26 -- spdk/autotest.sh@383 -- # hostname 00:09:15.877 05:34:26 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:15.877 geninfo: WARNING: invalid characters removed from testname! 
00:09:16.446 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:09:16.446 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:09:16.447 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:09:28.665 05:34:37 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:33.944 05:34:44 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:38.142 05:34:49 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:43.411 05:34:53 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:47.597 05:34:58 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:52.870 05:35:03 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 
-q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:57.059 05:35:08 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:57.059 05:35:08 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:09:57.059 05:35:08 -- common/autotest_common.sh@1690 -- $ lcov --version 00:09:57.059 05:35:08 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:09:57.059 05:35:08 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:09:57.059 05:35:08 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:09:57.059 05:35:08 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:09:57.059 05:35:08 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:09:57.059 05:35:08 -- scripts/common.sh@335 -- $ IFS=.-: 00:09:57.059 05:35:08 -- scripts/common.sh@335 -- $ read -ra ver1 00:09:57.059 05:35:08 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:57.059 05:35:08 -- scripts/common.sh@336 -- $ read -ra ver2 00:09:57.059 05:35:08 -- scripts/common.sh@337 -- $ local 'op=<' 00:09:57.059 05:35:08 -- scripts/common.sh@339 -- $ ver1_l=2 00:09:57.059 05:35:08 -- scripts/common.sh@340 -- $ ver2_l=1 00:09:57.059 05:35:08 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:09:57.059 05:35:08 -- scripts/common.sh@343 -- $ case "$op" in 00:09:57.059 05:35:08 -- scripts/common.sh@344 -- $ : 1 00:09:57.059 05:35:08 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:09:57.059 05:35:08 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:09:57.059 05:35:08 -- scripts/common.sh@364 -- $ decimal 1 00:09:57.059 05:35:08 -- scripts/common.sh@352 -- $ local d=1 00:09:57.059 05:35:08 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:57.059 05:35:08 -- scripts/common.sh@354 -- $ echo 1 00:09:57.059 05:35:08 -- scripts/common.sh@364 -- $ ver1[v]=1 00:09:57.059 05:35:08 -- scripts/common.sh@365 -- $ decimal 2 00:09:57.059 05:35:08 -- scripts/common.sh@352 -- $ local d=2 00:09:57.059 05:35:08 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:57.059 05:35:08 -- scripts/common.sh@354 -- $ echo 2 00:09:57.059 05:35:08 -- scripts/common.sh@365 -- $ ver2[v]=2 00:09:57.059 05:35:08 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:09:57.059 05:35:08 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:09:57.059 05:35:08 -- scripts/common.sh@367 -- $ return 0 00:09:57.059 05:35:08 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:57.059 05:35:08 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:09:57.059 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:57.059 --rc genhtml_branch_coverage=1 00:09:57.059 --rc genhtml_function_coverage=1 00:09:57.059 --rc genhtml_legend=1 00:09:57.059 --rc geninfo_all_blocks=1 00:09:57.059 --rc geninfo_unexecuted_blocks=1 00:09:57.059 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:57.059 ' 00:09:57.059 05:35:08 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:09:57.059 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:57.059 --rc genhtml_branch_coverage=1 00:09:57.059 --rc genhtml_function_coverage=1 00:09:57.059 --rc genhtml_legend=1 00:09:57.059 --rc geninfo_all_blocks=1 00:09:57.059 --rc geninfo_unexecuted_blocks=1 00:09:57.059 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:57.059 ' 00:09:57.059 
05:35:08 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov 00:09:57.059 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:57.059 --rc genhtml_branch_coverage=1 00:09:57.059 --rc genhtml_function_coverage=1 00:09:57.059 --rc genhtml_legend=1 00:09:57.059 --rc geninfo_all_blocks=1 00:09:57.059 --rc geninfo_unexecuted_blocks=1 00:09:57.059 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:57.059 ' 00:09:57.059 05:35:08 -- common/autotest_common.sh@1704 -- $ LCOV='lcov 00:09:57.059 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:57.059 --rc genhtml_branch_coverage=1 00:09:57.059 --rc genhtml_function_coverage=1 00:09:57.059 --rc genhtml_legend=1 00:09:57.059 --rc geninfo_all_blocks=1 00:09:57.059 --rc geninfo_unexecuted_blocks=1 00:09:57.059 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:57.059 ' 00:09:57.059 05:35:08 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:57.059 05:35:08 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:57.059 05:35:08 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:57.059 05:35:08 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:57.059 05:35:08 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:57.059 05:35:08 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:57.059 05:35:08 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:57.059 05:35:08 -- paths/export.sh@5 -- $ export PATH 00:09:57.059 05:35:08 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:57.059 05:35:08 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:57.059 05:35:08 -- common/autobuild_common.sh@440 -- $ date +%s 00:09:57.059 05:35:08 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732854908.XXXXXX 00:09:57.059 05:35:08 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732854908.QIPzVt 00:09:57.059 05:35:08 -- common/autobuild_common.sh@442 -- 
$ [[ -n '' ]] 00:09:57.059 05:35:08 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']' 00:09:57.059 05:35:08 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:57.059 05:35:08 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:09:57.059 05:35:08 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:57.059 05:35:08 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:57.059 05:35:08 -- common/autobuild_common.sh@456 -- $ get_config_params 00:09:57.060 05:35:08 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:09:57.060 05:35:08 -- common/autotest_common.sh@10 -- $ set +x 00:09:57.060 05:35:08 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:09:57.060 05:35:08 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:09:57.060 05:35:08 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:57.060 05:35:08 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:57.060 05:35:08 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:09:57.060 05:35:08 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:57.060 05:35:08 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:57.060 05:35:08 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:57.060 05:35:08 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:57.060 05:35:08 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:57.060 05:35:08 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:57.060 + [[ -n 2084168 ]] 00:09:57.060 + sudo kill 2084168 00:09:57.328 [Pipeline] } 00:09:57.344 [Pipeline] // stage 00:09:57.350 [Pipeline] } 00:09:57.364 [Pipeline] // timeout 00:09:57.369 [Pipeline] } 00:09:57.385 [Pipeline] // catchError 00:09:57.392 [Pipeline] } 00:09:57.410 [Pipeline] // wrap 00:09:57.416 [Pipeline] } 00:09:57.430 [Pipeline] // catchError 00:09:57.439 [Pipeline] stage 00:09:57.441 [Pipeline] { (Epilogue) 00:09:57.454 [Pipeline] catchError 00:09:57.459 [Pipeline] { 00:09:57.471 [Pipeline] echo 00:09:57.472 Cleanup processes 00:09:57.478 [Pipeline] sh 00:09:57.761 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:57.761 2237112 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:57.777 [Pipeline] sh 00:09:58.065 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:58.065 ++ grep -v 'sudo pgrep' 00:09:58.065 ++ awk '{print $1}' 00:09:58.065 + sudo kill -9 00:09:58.065 + true 00:09:58.078 [Pipeline] sh 00:09:58.368 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:58.368 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 
00:09:58.368 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:59.742 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:09.732 [Pipeline] sh 00:10:10.016 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:10:10.016 Artifacts sizes are good 00:10:10.031 [Pipeline] archiveArtifacts 00:10:10.039 Archiving artifacts 00:10:10.243 [Pipeline] sh 00:10:10.588 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:10:10.614 [Pipeline] cleanWs 00:10:10.624 [WS-CLEANUP] Deleting project workspace... 00:10:10.624 [WS-CLEANUP] Deferred wipeout is used... 00:10:10.630 [WS-CLEANUP] done 00:10:10.632 [Pipeline] } 00:10:10.649 [Pipeline] // catchError 00:10:10.662 [Pipeline] sh 00:10:10.944 + logger -p user.info -t JENKINS-CI 00:10:10.953 [Pipeline] } 00:10:10.966 [Pipeline] // stage 00:10:10.972 [Pipeline] } 00:10:10.986 [Pipeline] // node 00:10:10.991 [Pipeline] End of Pipeline 00:10:11.034 Finished: SUCCESS